Irish and French parliamentarians sound the alarm about EU’s CSA Regulation

The Irish parliament’s justice committee and the French Senate have become the latest voices to sound the alarm about the risk of general monitoring of people’s messages in the proposed Child Sexual Abuse (CSA) Regulation.

By EDRi & ICCL (guest author) · May 3, 2023

The Irish parliament’s justice committee and the French Senate have become the latest voices to sound the alarm about the risk of general monitoring of people’s messages (ChatControl) in the proposed Child Sexual Abuse (CSA) Regulation. This draft European Union (EU) law, currently being debated by Members of the European Parliament (MEPs) and EU home affairs ministries, seeks to tackle the spread of child sexual abuse material (CSAM) online across the bloc.

However, Europe’s top data protection regulators, 134 civil society groups, and independent experts for the European Parliament all warn that the proposed measures go far beyond what is acceptable in a democratic society, lack objective evidence of effectiveness, and may even do more harm than good for the very people they are supposed to protect. Co-lead MEP Alex Saliba has also highlighted the threat posed to secure digital communications and the risk of mass surveillance: “either we support secure encryption services and the prohibition of general monitoring, or we don’t.”

Now, French and Irish parliamentarians join the chorus in calling on the EU to ensure that proposed measures are compatible with the EU’s human rights framework. Otherwise, this law risks being thrown out in court for violating the privacy, security and free expression of people online – and that includes the very children that the law is supposed to help.

Irish Parliament warns against indiscriminate scanning of people’s private digital lives

The Irish justice committee has made a welcome and significant intervention against ‘ChatControl’ by calling into question the effectiveness of the proposal and highlighting how it could disproportionately intrude on the fundamental rights of everyone using digital communication in Europe.

In a nutshell, the Irish committee has strongly warned the President of the European Commission, Ursula von der Leyen; the President of the European Parliament, Roberta Metsola; and the President of the European Council, Charles Michel, that the proposed law would:

  • Be unprecedented in requiring indiscriminate scanning of digital communications, threatening the safety, privacy, and freedom of expression of everyone;
  • Undermine the security of our communications and online services;
  • Likely lead to police authorities across the European Union becoming inundated with false positives. The flagging of so many false positive referrals could ultimately result in vital police resources being exhausted, putting children who have been abused at greater risk; 
  • Increase the use of error-prone technology that already incorrectly flags innocent people as sharers of CSAM in a manner that is “extremely intrusive” to people’s rights;
  • Likely be struck down by the courts because it doesn’t comply with certain rights guaranteed by the Charter of Fundamental Rights of the EU, including the rights to privacy, protection of personal data and freedom of expression and information; and
  • Place a significant burden on Irish national authorities, given many Big Tech companies are located in Dublin.

The intervention and concerns about proportionality from the Irish committee follow figures released last October by EDRi affiliate the Irish Council for Civil Liberties, which show that, of the suspect referrals sent to Irish police by the U.S. National Center for Missing and Exploited Children (NCMEC) in 2020, more than 11% (471 referrals) were not CSAM, while 9.7% were ‘actionable’. No figures are available for the number of referrals that resulted in prosecution and/or conviction. However, the 11% false positive rate starkly contradicts the European Commission’s claim that scanning technology for known CSAM is 99% accurate. More worryingly, in Ireland, the police retain personal data about people wrongly flagged as reference and intelligence material for future investigations.

French Senate warns of unlawful general monitoring and data retention

In a resolution calling on the national government to negotiate for the better protection of fundamental rights in the EU’s CSA Regulation, French parliamentarians have raised significant concerns about the scope and measures currently on the table. Adopted by the Senate on 20 March 2023, the Resolution was proposed by conservative and Macronist senators, who emphasise that the protection of children online is of vital importance, but that there are serious issues with the proposed approach.

In particular, the Resolution questions the likely effectiveness of the proposed framework, and the lack of legal clarity about the scope of proposed Detection Orders. It states that this makes the risk of general monitoring unacceptably high. Pointing to case law of the Court of Justice of the EU (CJEU) to back up this point, the Senate reiterates that any restriction on fundamental rights needs to be necessary and proportionate, and that in the case of the CSA Regulation, there is currently a disproportionate risk of generalised and permanent surveillance, in violation of the fundamental EU human rights to privacy and confidentiality of communications.

The Resolution also points to the danger that the CSA Regulation would undermine end-to-end encryption, which the parliamentarians remind us is a human rights tool that needs to be strongly protected. They also warn about the immaturity and inaccuracy of tools designed to identify new CSAM or grooming, calling for these to be removed from the law and for more research to be conducted into rights-respecting detection tools. Finally, the Senate calls for more accountability for platforms, investment in prevention programmes, privacy-by-design measures, education of young people, and other holistic measures to strengthen the fight against CSA.

Commission’s own Regulatory Scrutiny Board raises concerns over mass surveillance

The concerns raised by the Irish justice committee and the French Senate echo similar issues raised by the European Commission’s own Regulatory Scrutiny Board (RSB), the body responsible for assessing whether a legislative proposal is necessary and proportionate according to human rights law. The RSB has already pointed out that parts of the proposed law would likely amount to generalised surveillance, which contravenes the EU prohibition of general monitoring. This is a risk that has also been emphasised by the United Nations High Commissioner for Human Rights.

Contribution by: Ella Jakubowska, Senior Policy Advisor, EDRi, and Olga Cronin, Policy Officer, EDRi affiliate ICCL

Raise your voice against ChatControl in the EU by joining the Stop Scanning Me movement

Join the 133 NGOs & over 10 000 individuals who are part of the movement