Private communications are a cornerstone of democratic society and must be protected in online CSAM legislation

On 17 March 2022, EDRi and 34 other civil society organisations jointly raised our voices to the European Commission to demand that the forthcoming EU ‘Legislation to effectively tackle child sexual abuse’ complies with EU fundamental rights and freedoms. We are seriously concerned that the draft law does not meet the requirements of proportionality and legitimacy that are rightly required of all EU laws, and would set a dangerous precedent for mass spying on private communications.

By EDRi · March 17, 2022

This blog provides further information about our demands.

***The accompanying letter remains open for new signatures and has now been signed by 43 organisations. You can add your civil society organisation’s signature now.***


What we want, and why you should care

People around the world rely on their private communications for everything from chatting with friends and family, to contacting doctors and lawyers, to blowing the whistle with journalists and organising for social change. What’s more, children in vulnerable positions may actually suffer if technological protections are weakened, as this can deprive them of the confidential communications they need to escape abusive situations. Both UNICEF and the United Nations have issued reports and comments on the importance of privacy and data protection for young people.

We urge the Commission to heed the warnings of our letter, and ensure that any proposal to tackle the online dissemination of child sexual abuse material:

  1. Does not contain any provisions which could lead to or force mass surveillance;
  2. Allows only targeted interventions based in law and with judicial oversight; and
  3. Ensures that measures are narrowly restricted and as minimally privacy-invasive as possible.

Anything less would fly in the face of EU fundamental rights, democracy and the rule of law.

The forthcoming proposal to tackle child sexual abuse online

The European Commission is currently scheduled to launch its proposal for a new law to replace last year’s short-term ‘ePrivacy derogation for the purpose of tackling child sexual abuse material (CSAM)’ on 30 March 2022 (although further delays are possible). At the time of the short-term legislation, EDRi actively raised concerns that the law would, and indeed did, allow companies to spy on everyone’s communications.

The new legislation will focus on two main pillars: firstly, it will impose obligations on all internet service providers offering chat or messaging services to detect, report and remove CSAM. Compared to last year’s voluntary measures, this new proposal will likely force all companies providing chat or web-based email services to scan every person’s private conversations. Secondly, the proposal will create a European Centre to centralise these activities, as well as to increase certain prevention efforts. It has been referred to as a European version of the US ‘National Center for Missing and Exploited Children’, NCMEC.

The proposal is led by Commissioner Ylva Johansson, the EU’s top bureaucrat for Home Affairs, internal security and migration. She has led several EU initiatives which EDRi has strongly criticised for their lack of fundamental rights protections, including recent new laws giving a “blank cheque” to the European police agency Europol.

We requested a meeting to raise our concerns that the Commission’s proposed ‘solutions’ to the issue of CSAM online could have a severe impact on the freedom to seek, receive and impart information worldwide, and to collaborate on ensuring that the plans meet fundamental rights standards. The Commissioner declined to meet with the EDRi network.

Furthermore, her comments have sparked great concerns among all of us that care about maintaining open and democratic societies. If adopted, this law would make the EU a world leader in the generalised surveillance of whole populations. How, then, would the EU be able to speak out when undemocratic regimes enact the same measures?

Mandating the impossible?

The crux of the issue is that this proposal presumes that technology can offer a quick fix to an issue that is in fact deeply complex, and goes far beyond the remit of what technology is feasibly able to achieve. Although digital technologies facilitate the spread of CSAM, the crime of child sexual abuse stems from human actions. If the aim of this new legislation is “to effectively tackle child sexual abuse”, then to be truly effective, it needs to focus at least as intensively on where the criminal activity originates, and on stopping criminal acts of abuse before they happen.

What’s more, technologies which claim to be useful for preventing the spread of such content are highly flawed and inherently limited. That’s why our letter emphasises that:

“[T]here is no way to give law enforcement exceptional access to communications that are encrypted end-to-end without creating vulnerabilities that criminals and repressive governments can exploit.”

And further that:

“Measures which break or undermine encryption (such as Client-Side Scanning); which are experimental or inaccurate; or which create cybersecurity risks will always create far more problems than they can solve.”

Forcing providers of chats, messenger services and emails to do something that they technically cannot do – at least not accurately, and not without jeopardising the safety and security of hundreds of millions of people and putting the very essence of free speech at risk – would undermine the EU’s own values. It would also go against the Commission’s commitment to base laws on evidence and proportionality, instead instrumentalising fear and emotions to drive legislation.

It is likely that the proposal will not explicitly require providers to use any particular technology in order to meet their detection or reporting obligations. Furthermore, Commissioner Johansson confirmed to MEP Patrick Breyer that the proposal will not “prohibit or generally weaken encryption”. However, this is a deliberate strategy to put the responsibility onto providers, which we have seen before in the proactive obligations in EU files on copyright, terrorist content and even the Digital Services Act (DSA).

Furthermore, if, as we expect, the proposal makes service providers liable for outcomes that would in effect require them to use technologies like Client-Side Scanning (CSS), AI-based scanning or generalised monitoring of content, then this would undermine the essence of end-to-end encryption. This would be just as bad as if the law explicitly required providers to do these things (for example by prohibiting encryption or mandating CSS). It also shows that the proclaimed commitment to technological neutrality is misleading. That’s why our letter warns that:

“The Legislation to effectively tackle child sexual abuse must not compel service providers to take steps or to ensure outcomes that would in effect force them to conduct such practices [that would undermine encryption or constitute generalised monitoring].”

Read the open letter to the European Commission here.

Ella Jakubowska

Policy Advisor

Twitter: @ellajakubowska1

Read more: