Internal documents have confirmed the worst fears for private communications in the EU; how will the Commissioners respond?

EDRi, the biggest European network defending rights and freedoms online, urges the European Commission not to put forward a CSAM proposal that would undermine the CJEU prohibition of general monitoring or subject Europeans to monitoring that would turn their devices into spyware.

By EDRi · April 22, 2022

On 17 March, EDRi was one of the now 47 civil society organisations that called on the European Commission to ensure that the future ‘Legislation to effectively tackle child sexual abuse’ fulfils its goal of protecting children whilst also respecting the EU’s fundamental rights framework.

Since then, a shocking leak confirmed our worst fears: the proposal would force providers of messaging, chat and web-based email services in the EU to apply dangerous and faulty scanning to all our private messages – even those that are encrypted!

As we have not yet had a response from any of the Commissioners, nor from Commission President Ursula von der Leyen, we decided to follow up to emphasise our concerns. We call on them to make sure that any legislation put forward by the EU meets the fundamental rights criteria of necessity and proportionality – and, moreover, that it does not compel European services to undertake practices that are illegal. In a democratic and rule-of-law-respecting Union, this should not be too much to ask.

So without further ado, here’s what we had to say:

Dear European Commission President Ursula von der Leyen,

We are writing to follow up on the civil society open letter that we sent you last month about the upcoming ‘Legislation to effectively tackle child sexual abuse’. The letter has now been signed by a total of 47 organisations working for rights and freedoms across Europe and beyond, with support continuing to grow.

We urge you to pay attention to the concerns that we raise, and to use your executive powers to ensure that the Commission does not put forward a proposal that would undermine the CJEU prohibition of general monitoring or subject Europeans to monitoring that would turn their devices into spyware.

In particular, we would like to highlight that since we sent you our letter, the leaked Opinion of the Regulatory Scrutiny Board (RSB) revealed the shocking news that the upcoming proposal seeks to mandate generalised scanning, even in encrypted environments. The method that would be used for such detection is ‘Client-Side Scanning’ (CSS), a fundamentally flawed method which experts agree undermines encryption (a simplified sketch of the mechanism follows the list below):

  • In this paper, 14 of the world’s leading cybersecurity experts explain how CSS leaves every person’s device vulnerable to hackers, criminals and other malicious actors.
  • In this article, two Princeton computer scientists explain that CSAM scanning in encrypted environments is technically flawed in ways that cannot be mitigated.
  • In this article, security experts at the EFF explain further risks of CSS.
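To make concrete why the experts reach this conclusion, here is a minimal, purely illustrative sketch in Python. It is not any real scanning system: the function names, the detection list and the toy cipher are all hypothetical stand-ins. The point it demonstrates is structural: under CSS, the plaintext of a message is checked against a detection list on the user’s own device before encryption, so end-to-end encryption no longer guarantees that only the recipient can act on the content.

```python
# Purely illustrative sketch of the client-side scanning (CSS) pattern.
# Not a real system: the names, the detection list and the toy cipher are
# hypothetical. Real deployments use perceptual hashes (PhotoDNA-style),
# which can also match wrongly, rather than exact SHA-256 digests.
import hashlib

# Hypothetical detection list of digests the provider must flag.
DETECTION_LIST = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def report_match(digest: str) -> None:
    """Hypothetical reporting hook; a real system would file a report."""
    print(f"match reported: {digest[:12]}...")

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for end-to-end encryption (a toy XOR; the cipher is beside the point)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes) -> bytes:
    # The scan runs on-device, on the plaintext, BEFORE encryption happens.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in DETECTION_LIST:
        report_match(digest)
    return toy_encrypt(plaintext, key)

ciphertext = send_message(b"a private message", b"shared-secret")
```

Note that the encryption step itself is untouched, which is why providers could claim that messages ‘stay encrypted’; the confidentiality is lost one step earlier, at the scanning hook that every device would be required to run.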

Furthermore, the RSB Opinion revealed that ‘unknown’ content would also be in scope of these rules. We remind you that AI tools are notoriously inaccurate at such tasks and poor at interpreting context. Such requirements would not only be very invasive, but would also expose people to large numbers of erroneous reports. For example, in a period of just 53 days in 2021, Meta found that their systems had wrongly accused at least 207 people in the EU of exchanging CSAM, with an additional 4,500 people lodging complaints.

Scaled up across time and multiple services and platforms, this volume of wrongful accusations could leave a vast number of people falsely accused. Because of the sheer scale of private messaging in the EU, this holds even if – as some providers of such technologies claim – their systems have a high theoretical accuracy. We have many reasons to doubt that claim. In 2021, for example, LinkedIn found that only 31 of the 75 files identified by PhotoDNA as CSAM actually were CSAM, meaning the true-match rate was just 41%.
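The arithmetic behind this concern is worth spelling out. Below is a short back-of-the-envelope calculation in Python; every number in it is a hypothetical assumption chosen for illustration, not a figure from Meta, LinkedIn or any other provider. It shows the base-rate effect: when genuinely illegal material is a tiny fraction of traffic, even a classifier with a seemingly low error rate produces vastly more false accusations than true detections.

```python
# Back-of-the-envelope base-rate calculation. ALL numbers below are
# hypothetical assumptions for illustration, not real providers' figures.
daily_messages = 10_000_000_000  # assumed private messages scanned per day, EU-wide
prevalence = 1e-6                # assumed fraction of messages that are actually CSAM
sensitivity = 0.99               # assumed true-positive rate of the scanner
false_positive_rate = 0.001     # assumed: wrongly flags 0.1% of innocent messages

true_positives = daily_messages * prevalence * sensitivity
false_positives = daily_messages * (1 - prevalence) * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"true detections/day:   {true_positives:,.0f}")      # ~9,900
print(f"false accusations/day: {false_positives:,.0f}")     # ~10,000,000
print(f"share of flags that are correct: {precision:.2%}")  # ~0.10%
```

Under these illustrative assumptions, roughly 99.9% of everyone flagged would be innocent – precisely the pattern that the Meta and LinkedIn figures above hint at.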

We believe that if the upcoming legislation mandates generalised scanning, the Legislation to effectively tackle child sexual abuse will not be fit for purpose and will, moreover, be vulnerable to legal challenge.

If the European Commission decides to propose a long-term derogation from the ePrivacy Directive, it needs to ensure the highest standards of necessity and proportionality. Furthermore, current “voluntary” practices by providers should be reviewed by data protection authorities (DPAs) under the guidance of the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS). If data protection authorities consider these current practices legal, they should be limited to scanning for known images, with seriously enhanced mechanisms for transparency, accountability, oversight and human rights safeguards compared, at a minimum, to those in the interim Regulation.

  • Find all our content related to this proposal in our new document pool (EDRi, March 2022)