The CSA Regulation: how did it reach this point?

How did we reach the point of even discussing a law (the Child Sexual Abuse Regulation) that so manifestly undermines our democratic structures, threatens to override the fundamental rights that generations have fought for, and ignores solid evidence and unanimous professional expertise?

By EDRi · May 11, 2023

On 11 May 2022, the European Commission put forward a proposal for a law to ‘prevent and combat child sexual abuse’ (CSA Regulation). The proposal would force social media platforms, email and chat providers, and other digital services to monitor the public and private communications of potentially all of their users. The law’s architect, Home Affairs Commissioner Ylva Johansson, claimed that her law is the “only way” to keep children safe online.

The CSA Regulation proposal has been a controversial piece of legislation from the very beginning. We are seeing a democratic scandal unfold before us, as groundbreaking decisions are being taken out of a false sense of urgency rather than on the basis of good evidence and solid processes. The handling of the CSA Regulation at an institutional level has brought to the surface a long history of poor-quality policymaking. This risks severely damaging the democratic values and foundations upon which the Union has been built.

As experts in online regulation, such as the EU’s top data protection authorities, warn, supporting this law will not bring us closer to keeping children safe. It will only legitimise scandalous practices that have no place in the EU, and will put the democratically elected Members of the European Parliament in the uncomfortable position of trying to find a ‘solution’ that makes mass digital surveillance acceptable. The CSA Regulation proposal as put forward by Johansson and her staff (DG HOME) has turned the conversation into a zero-sum game between fundamental rights, which sadly will not contribute to the better protection of anyone – especially the children this legislation is supposed to help.

Whose voices are being silenced?

Civil society

To start with, child protection falls largely under the responsibility of the EU’s Member States. To circumvent this, the current proposal has been positioned within the EU’s competence to regulate the internal market instead. In particular, the law regulates how platforms such as Signal, Facebook, WhatsApp and Instagram must operate in order to scan for and flag child sexual abuse material (CSAM).

This is to say that the CSA Regulation is founded on internal market legislation – specifically, the regulation of digital platforms in order to remove barriers to their operation. The relevant expertise in this context therefore lies with technology and digital rights experts, who know the existing regulatory frameworks well and understand the technological challenges the proposal poses.

However, since the very beginning, digital rights organisations like EDRi have been excluded from the conversation. EDRi and 133 other civil society organisations, including several representing children and young people’s rights, called for the proposal to be withdrawn for its fundamental incompatibility with EU rules and values; the EU’s top data protection authorities jointly warned that the proposal would amount to mass surveillance; and technologists widely decried its lack of even the most basic technical understanding. 

Experts have tried to draw Commissioner Johansson’s attention to the mountain of evidence clearly showing that the technical solutions her proposal relies on cannot help to eliminate, mitigate or reduce threats to children’s safety. But these efforts have been cast to the wind: DG HOME has dismissed all concerns raised by experts on this law as mere “disagreement”, and has even outright refused to meet with digital rights groups.

EU institutional expertise

Johansson has ignored not only technology and digital rights experts, but also the European Commission’s own internal body that scrutinises the legitimacy of new initiatives. Even before the proposal was published, the “Regulatory Scrutiny Board” raised alarm bells that it might violate the EU’s prohibition of general monitoring.

We have also heard national members of parliament in Austria, France, the Netherlands and Ireland expressly call on their governments not to agree to this new law.

An independent study commissioned by the European Parliament found that the proposal amounts to significant and unjustifiable violations of the human rights of potentially hundreds of millions of people, without evidence that it will be effective in its purported goal of protecting children. One Member of the European Parliament (MEP) commented that this level of broad and diverse criticism of an EU proposal is almost unprecedented.

And most recently, as reported by The Guardian, a Council leak revealed that the legal service of the Council of the EU, the decision-making body led by national ministers, has advised that the proposed regulation poses a “particularly serious limitation to the rights to privacy and personal data” and that there is a “serious risk” of it falling foul of a judicial review on multiple grounds.

Police forces

Police forces have also spoken up as the bodies working most directly to protect children, saying that the proposed draft law will make their job more difficult and less effective. Dutch and German police, as well as German public prosecutors, warned that the law would make it harder to find and prosecute perpetrators.

Industry

We also see a clear message coming from industry, the main target of this law, that Johansson’s proposal cannot be technologically implemented. For example, nine European industry associations representing members like Google, Meta, Microsoft and Mozilla have voiced their concerns, pointing in particular to the necessity and importance of encryption. However, Johansson refuses to hear what experts say about the severe damage that breaking encryption would do to internet infrastructure, and to global efforts to create online spaces that enhance trust, privacy and freedom of expression.

Instead, celebrities with no understanding of the right to privacy or of EU human rights rules are given a wide platform to amplify Johansson’s agenda. But as Meredith Whittaker, Signal’s President, has pointed out, what the proposal really does is open the door to mass surveillance.

“I’m not a celebrity or an influencer, but I do know tech, and I will state for the record that there is no such thing. It’s simply not possible. And either these people are badly misinformed, in a deep and concerning state of denial, or dangerously cynical – hoping that by promising a nonsense tech solution they will get laws passed and implement surveillance before anyone is the wiser.”

Meredith Whittaker, Signal’s President

CSA Regulation: a law for children that cannot offer real protection

Despite the Court of Justice of the EU requiring governments to strike a balance between human rights when they are in conflict, DG HOME failed to properly assess and analyse the rights to online privacy and freedom of expression not just of children, but also of potentially all internet users. The proposal paints a picture of the internet as an almost unequivocally dangerous place, failing to consider the many benefits and positives of the internet and of digital communication tools, and the people who rely on them every day.

This lack of balanced and comprehensive human rights assessment was one of the key reasons why the European Parliament felt it necessary to commission a “Complementary Impact Assessment” to deal with these issues that DG HOME had all but ignored.

The right to privacy and data protection online is above all essential for children’s safety. Online spaces provide extraordinary opportunities to connect with others and to build safe spaces for exploration. Such empowering and safe spaces are only possible with privacy. For young people, the scanning of their private communications, pictures and searches online turns normal activities into risky business. It can have serious consequences, in particular for queer, racialised or other marginalised youth.

The United Nations and UNICEF state that online privacy is vital for young people’s development and self-expression, and children should not be subjected to generalised surveillance.

The UK Royal College of Psychiatrists highlights that snooping is harmful to children and that policies based on empowerment and education are more effective.

Children themselves say that they would not feel comfortable and safe being politically active or exploring their sexuality if authorities were able to monitor their digital communications in order to look for child sexual abuse material.

Child rights expert Dr Sabine K Witting from the University of Leiden has also stated very clearly that the detection of new material would seriously violate children’s sexual self-expression, and that such detection orders must be removed from the proposal.

“[…] the CSAM Regulation seriously violates the rights of the young people concerned, especially in the area of privacy and thus in one of the most protected areas of the right to privacy. The only way to prevent this would be to radically change the CSAM regulation: remove orders for the detection of unknown abuse material from the draft regulation altogether.”

Dr Sabine K Witting, University of Leiden

The time to act is now

Thanks to a Commissioner who categorically refuses to hear any criticism of ‘her’ law, the credibility of the EU as an international leader in human rights and data protection is at stake. It is now up to the Members of the European Parliament to stand up for the democratic rights and processes upon which the EU is built. Voting in favour of this law would mean putting people across the EU – and the world – in great danger. Children deserve better.

What MEPs can do to protect children online is to ensure that this law is not passed in its current form. Ahead of the European Parliament elections, over 10 000 people from across the EU are asking MEPs to do exactly that.

Join the Stop Scanning Me movement of more than 130 NGOs and over 10 000 individuals to protect children’s privacy and safety online by urging the European Parliament to reject the CSA Regulation proposal.

Join us!