Commissioner Johansson cannot be trusted with the EU’s proposed CSA Regulation

In the midst of a wide range of concerning practices and behaviours, EDRi has found it necessary to raise a formal complaint against the EU’s Home Affairs department for possible breaches of independence.

By EDRi · May 12, 2023

A recent opinion from the EU Council Legal Service is the latest authoritative assessment to point out the unlawfulness of the CSA Regulation, supporting earlier assessments by the EDPS. This comes after independent researchers at TU Delft pointed out inaccuracies and inconsistencies in many of the public claims made by lead Commissioner Johansson. In the midst of a wide range of concerning practices and behaviours, EDRi has found it necessary to raise a formal complaint against the EU’s Home Affairs department for possible breaches of independence. As civil society, we play a role as watchdogs of the EU institutions. It is vital to hold them to account for conduct that puts the integrity and credibility of EU lawmaking processes at risk.

EDRi holds DG HOME to account

EDRi has now made a formal complaint to the European Commission about recent conduct which we believe violates its internal requirements of impartiality, and possibly also transparency. We’ve been very vocal about the fact that while Johansson rejected meetings with EDRi a grand total of three times, she was frequently meeting with big tech companies.

As reported by Wired, DG HOME and Johansson have continuously ignored key voices, including civil society. Representing the voices of 47 human rights organisations across Europe and beyond, EDRi is a key stakeholder. Our concerns, however, have been repeatedly dismissed in public fora by Commissioner Johansson.

In March 2023, DG HOME published a series of communications encouraging the public to sign a petition in support of the law. Despite many civil society groups opposing the law – including a petition from fifteen human rights and justice organisations in the Stop Scanning Me EU campaign calling on the EU to withdraw the proposal, and a complementary demand from 134 civil society groups – DG HOME decided to present only one perspective on the debate. This is despite the Commission’s mandate requiring independence.

What’s more, the petition that DG HOME chose to promote is run by advocacy groups, including several that – despite their clear intention to influence EU institutions – are not, at the time of writing, listed in the EU’s transparency register.

By promoting certain parties to the debate whilst dismissing others, DG HOME is amplifying the lobbying activities of certain stakeholders in a way that could influence the legislative process. Rather than respecting their role as neutral servants representing the best interests of the EU, DG HOME in general, and Commissioner Johansson specifically, have become some of the most strident and one-sided voices lobbying for a law that would treat everyone as a suspect and impose mass surveillance on almost our entire digital lives.

The false claims of Commissioner Johansson

EDRi has conducted an investigation into Johansson’s claims about the CSA Regulation and – based on solid evidence – has assessed that Johansson is repeatedly making inaccurate and misleading claims about how the technology works, what the law could really achieve in practice, and what current detection practices have shown.

Tools that scan for CSAM (child sexual abuse material) are NOT highly accurate

Johansson has repeatedly said that tools that scan for CSAM are highly accurate. For example, in a presentation to and exchange with LIBE MEPs in late 2022, Johansson claimed that CSAM scanning tools have an accuracy of 99.99%.

Reality: This is not the case; accuracy varies significantly across methods. What’s more, the figures have never been verified other than by the commercial entities (Microsoft and Thorn – the organisation behind Safer) which sell the technology worldwide.

The industry itself disagrees with the figures

  •  These figures have been disputed by one of the very tech companies from which they came. In their response to the public consultation on the CSA Regulation, Microsoft pointed out that the figure of 88% accuracy for grooming-detection tools claimed by DG HOME – a figure which came from Microsoft’s own research – was not a reliable indicator.

These claims have never been validated – not even by the Commission

  •  In response to a Freedom of Information (FOI) request filed by Felix Reda in 2022, the Commission admitted that the numbers had never been tested or verified, and were simply supplier claims taken at face value.

The Impact Assessment is a lot more cautious about PhotoDNA than Commissioner Johansson (and its developer, Hany Farid) make it seem

  •  The Commission’s own Impact Assessment, as well as Professor of Cryptography Matthew Green, have warned about the risk of PhotoDNA or other scanning tools being ‘poisoned’ to deliberately trigger false positives (innocent material being flagged as CSAM). An article from PhotoDNA user Cloudflare shows the ease of generating false alerts.
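
To make the risk concrete, here is a minimal sketch of how such a ‘poisoning’ attack works in principle. PhotoDNA’s algorithm is proprietary, so the sketch uses a toy ‘average hash’ as a stand-in – the hash function, image sizes and nudge value are our own illustrative assumptions, not PhotoDNA internals:

```python
import numpy as np

def average_hash(img, grid=8):
    """Toy perceptual hash: mean of each block, thresholded at the overall mean."""
    h, w = img.shape
    blocks = img.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

rng = np.random.default_rng(0)
flagged = rng.random((64, 64))    # stand-in for an image on a blocklist
innocent = rng.random((64, 64))   # stand-in for an unrelated, innocent image

flagged_hash = average_hash(flagged)
print("hash bits differing before:",
      int((average_hash(innocent) != flagged_hash).sum()))

# 'Poisoning': shift each 8x8 block's brightness slightly so that the
# innocent image's hash bits flip, one by one, to match the flagged hash.
poisoned = innocent.copy()
base = poisoned.mean()
for i in range(8):
    for j in range(8):
        block = poisoned[i * 8:(i + 1) * 8, j * 8:(j + 1) * 8]
        nudge = 0.02 if flagged_hash[i * 8 + j] else -0.02
        block += (base - block.mean()) + nudge   # in-place update of the view

print("hash bits differing after:",
      int((average_hash(poisoned) != flagged_hash).sum()))   # 0: full match
```

Real perceptual hashes are harder to fool than this toy, but the class of attack is the same: small, targeted changes to innocent content can force a match.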

At scale, the statistics are not as good as they seem

  •  99.99% accuracy (i.e. an error rate of 0.01% per item scanned) is not a reassuring level of performance when we are talking about the volumes of messages, images etc. that are exchanged via digital communications channels every day. For example, it is estimated that 4.5 billion images are exchanged on WhatsApp alone each day. If just 1 in 10 of these were exchanged in the EU, even at 99.99% accuracy, that would still amount to 45,000 false alerts per day – and that’s just one platform (the arithmetic is sketched after this list).
  •  Additionally, current practices show a high level of inaccuracy when it comes to flagged CSAM. LinkedIn’s use of PhotoDNA in 2021 showed that only 41% of the material flagged by PhotoDNA was actually CSAM. This suggests that, in practice, detection tools are run with much looser match thresholds than those used to claim a theoretical accuracy of 99.99%.
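
The arithmetic behind that 45,000 figure can be checked in a few lines; the daily volume and the 1-in-10 EU share are the estimates quoted above:

```python
# Back-of-the-envelope check of the false-alert estimate above.
images_per_day = 4.5e9      # estimated daily images exchanged on WhatsApp
eu_share = 1 / 10           # assumption from above: 1 in 10 exchanged in the EU
error_rate = 1 - 0.9999     # the claimed 99.99% accuracy, i.e. 0.01% errors

false_alerts = images_per_day * eu_share * error_rate
print(f"{false_alerts:,.0f} false alerts per day")  # -> 45,000
```

And as the LinkedIn figures show, real-world precision can be far lower than the theoretical claim, so this estimate is, if anything, generous.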

Automated tools to scan messages are NOT safe

In the debate with Members of the European Parliament (MEPs) from the LIBE Committee in 2022, the Commissioner dismissed arguments that scanning puts people’s privacy, free expression and presumption of innocence at risk, claiming that “if that was the case [that scanning has serious risks] it would already happen right now.”

Johansson has repeated several times that these tools have been in use for 10 years without any problems, e.g. at the MEP hearing: “it’s been going on for 10 years. […] this is actually totally false to say that with a new regulation, there will be new possibilities for detection that don’t exist today.”

Reality: There is already evidence of people who have been seriously harmed by false alerts, so it is misleading to suggest that there has never been an issue with scanning tools. What’s more, the CSA Regulation would go a step further: it would also scan in encrypted environments and make reporting to law enforcement mandatory even for consensual intimate images sent between teenagers above the age of majority, for example – which would increase the risks even more.

Several court cases have emerged of people whose digital lives have been unfairly erased because of false accusations

  •  This is not just a matter of losing access to a single document: people’s personal and work emails, or their entire cloud photo storage, have been blocked and not reinstated even after they were cleared of wrongdoing. For example, two cases in the Netherlands relating to Microsoft can be seen here and here.

The Irish police have likely unlawfully retained data on at least 500 people who were mistakenly flagged for sharing CSAM when they were actually sharing innocent material, such as family pictures

The proposed CSA Regulation will not be able to successfully mitigate these risks

  •  The proposal requires providers and also the EU Centre to forward all content to national law enforcement unless it is “manifestly unfounded” to be CSAM. This sets a very high bar for withholding a report: legally, providers and the EU Centre must pass any content to national law enforcement unless they are completely sure it is not CSAM (e.g. because it is a picture of a kitten). This contradicts a claim from Johansson in the aforementioned MEP hearing that “the EU Centre will do the filtering. [It] will make sure only the real reports go to police.”
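
Read literally, this inverts the usual default. A hypothetical sketch of the decision logic the wording implies – the names are ours, not the proposal’s:

```python
def must_forward_to_law_enforcement(manifestly_not_csam: bool) -> bool:
    """Hypothetical reading of the proposal's forwarding rule: forwarding is
    the default; only content that is *manifestly* not CSAM may be withheld."""
    return not manifestly_not_csam

# Any uncertain case gets forwarded; only certainty of innocence blocks it.
print(must_forward_to_law_enforcement(True))   # e.g. a kitten picture -> False
print(must_forward_to_law_enforcement(False))  # anything uncertain -> True
```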

Platforms have had to reinstate thousands of accounts of people who were blocked for false accusations of sharing CSAM under the interim derogation

  •  This is visible in the transparency reports (e.g. Meta’s) published on the basis of the interim derogation.

Tools to safely scan encrypted messages do NOT exist

In the meeting with MEPs, Johansson claimed that services like WhatsApp “are doing detection in encrypted communication today” to look for malware, and that this is evidence that scanning doesn’t undermine encryption. The Commissioner also told MEPs that “I don’t consider it [the proposal] breaks encryption.”

Reality: There are currently no companies scanning encrypted messages in the way proposed by the CSA Regulation. Technologists warn that this cannot be done whilst upholding the integrity and security of a service.

The Commission’s own ‘expert’ review shows that scanning tools fail to meet acceptable levels of security, privacy and feasibility

  •  See the Impact Assessment, especially the expert group annex (annex 9, pages 285–315). What’s more, 30 leading experts in the field have since stated that not one of them was consulted for this, and that they unequivocally consider that the current proposal would break end-to-end encryption and is based on very risky technologies.

Referring to the suspicious link warning function of WhatsApp as malware detection is technically misleading

  •  This is a simple rules-based function that looks for patterns and warns the user; it does not constitute content scanning and reporting in the way that would be required by the Commission’s proposal. To implement that, experts have been unanimous that client-side scanning would be required – a technique that is currently not used by any provider, and that failed completely when Apple tried to implement it in 2021.
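
To illustrate why the two are categorically different, here is a hypothetical sketch – not WhatsApp’s implementation, and all names are our own. A rules-based link warning runs locally and only advises the user, while detect-and-report scanning compares content against a hash list and notifies a third party of matches:

```python
import hashlib
import re

SUSPICIOUS_LINK = re.compile(r"https?://\S*(?:login|verify|prize)\S*", re.IGNORECASE)

def warn_on_suspicious_link(message: str) -> bool:
    """Rules-based check: pattern match on the device; the user sees a
    warning and nothing leaves the phone."""
    return bool(SUSPICIOUS_LINK.search(message))

def report_to_authority(digest: str) -> None:
    """Placeholder for the mandatory reporting channel the proposal envisages."""
    print(f"report forwarded: {digest}")

def scan_and_report(content: bytes, flagged_hashes: set) -> None:
    """Client-side scanning as the proposal would require: compare content
    against a hash list and forward matches to an external authority.
    (Real deployments would use perceptual hashes, not SHA-256.)"""
    digest = hashlib.sha256(content).hexdigest()
    if digest in flagged_hashes:
        report_to_authority(digest)

# The first only warns locally; the second reports the user externally.
print(warn_on_suspicious_link("click http://evil.example/verify-account now"))  # True
```

The first function never sends anything off the device; the second exists precisely to report the user to an external party – which is why describing one as the other is misleading.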

Apple abandoned all plans for client-side scanning in December 2022

  •  A more limited client-side scanning system was proposed by Apple in August 2021. After substantial criticism from security experts, Apple put the plans on hold, and in December 2022 officially announced that the project was abandoned.

A new paper from the University of Amsterdam’s Institute for Information Law (IViR) reiterates that, in order to carry out a Detection Order in an encrypted environment, providers of encrypted services would have to alter their encryption or alter their users’ devices.

If the EU does not pass the new law by the summer of 2024, children will NOT be left without any protection online.

The Commissioner warned MEPs in the LIBE Committee that if they do not pass the Regulation before summer 2024, children will be left with no protection online: “there is a huge risk of a gap.” These claims have been repeated many times by Johansson and DG HOME staff members.

Reality: Commissioner Johansson’s own team has already informed Member States that the interim regulation can easily be extended if needed due to a lack of agreement on the CSA Regulation. The manufactured sense of urgency is therefore not only unhelpful, but also likely to be counter-productive to finding a robust, reasonable, rights-protective solution.

As confirmed in writing to the German government by DG HOME, it is possible to extend the interim derogation

  •  Most relevant parts (in translation): 

    – On Germany’s question 10 (“Zu DEU Frage 10”): “Can COM [the European Commission] confirm that providers’ voluntary search for CSAM remains (legally) possible? Are there plans to extend the interim regulation, which allows providers to search for CSAM?”
    DG HOME: “A permanent and clear legal basis is required. Regulatory gaps after the expiry of the Interim Regulation must be prevented. It is still too early to determine what the transitional period until the entry into force of a CSA Regulation could look like; an extension of the Interim Regulation is also possible. Hosting service providers not covered by the ePrivacy Regulation and therefore not affected by the expiry of the Interim Regulation could continue to take voluntary measures.”

The CSA Regulation is intended to address the important issue of keeping children safe online. But it has become a tool in the hands of Ylva Johansson, who has foregrounded her political agenda at the expense of finding feasible and effective solutions. As EDRi has already discussed, solid evidence, supported by expert opinion, shows that this law will not bring us closer to keeping children safe. It will only legitimise scandalous practices that have no place in the EU.