Leaked opinion of the Commission sets off alarm bells over mass surveillance of private communications

A newly-revealed Opinion of a European Commission review board raises strong concerns about the Commission’s own upcoming proposal for a ‘Legislation to effectively tackle child sexual abuse’. Leaked by French media outlet Contexte yesterday (22 March), and dated 15 February 2022, the Opinion confirms the fears that EDRi and 39 other civil society groups recently raised: the proposal could destroy the integrity of private online communications across the EU and set a dangerous precedent for the world.

By EDRi · March 23, 2022

“Reservations”. “Significant shortcomings”. “Efficiency and proportionality […] not sufficiently demonstrated.” “Options […] are not presented in a sufficiently open, complete and balanced manner.”

It might sound like we are talking about an inquiry into a dodgy business deal or some sort of murky political scandal. In fact, the quotations above come from a newly-revealed Opinion of a European Commission review board about their own colleagues’ upcoming proposal for a ‘Legislation to effectively tackle child sexual abuse’. The proposal, which focuses on curbing the online spread of child sexual abuse material, is currently scheduled to be published on 27 April 2022, although a further delay to May is likely.

Leaked by French media outlet Contexte yesterday (22 March), and dated 15 February 2022, these worrying statements from within the European Commission are just the tip of the iceberg for a legislative proposal that EDRi and 39 other civil society groups recently warned could destroy the integrity of private online communications across the EU, and set a dangerous precedent for the world.

In meetings, the staff of Commissioner for Home Affairs Ylva Johansson, who leads the file, reassured us that the new law would not contain requirements for generalised scanning, and that it would not touch encryption. But the findings of the ‘Regulatory Scrutiny Board’ (RSB), which conducted the internal review, tell a very different story:

“The report [on the Legislation to effectively tackle child sexual abuse] is not sufficiently clear on how the options that include the detection of new child sexual abuse material or grooming would respect the [EU] prohibition of general monitoring obligations.”

“In view of the assertion […] about the limitations of available technologies that exist for the use in encrypted communications […] the report should be clearer about the practical feasibility of the policy options and provide reassurance about the effective application.”

“The report should clarify how the options that include an obligation to detect new child sexual abuse material or grooming would respect privacy requirements, in particular the prohibition of general monitoring obligations.”

What this tells us is that the current draft of the Legislation, from Commissioner Johansson and her team in DG HOME, contains rules which would force online communications service providers to conduct generalised monitoring of people’s private communications – even those that are encrypted. Furthermore, the Opinion notes that general monitoring is prohibited under EU law, meaning that if the proposal goes forward, the resulting law could be struck down by the Court of Justice.

Earlier this year, EDRi warned via our ‘Chat Control 10 Principles’ that generalised monitoring would amount to undemocratic and unlawful mass surveillance. EDRi recommended instead that any intervention into people’s confidential chats be targeted, based on reasonable suspicion, judicial warrants and proper safeguards, in line with EU fundamental rights law.

Furthermore, we cautioned the Commission not to weaken, undermine or circumvent encryption. Encryption is vital to protecting everything from online shopping to national security, and some of the world’s leading cyber-security experts recently warned that forcing the scanning of encrypted environments neither guarantees effective crime prevention nor prevents surveillance – indeed, the effect is the opposite.

What’s more, the Opinion indicates that the draft law would require this generalised monitoring not just for material that authorities have assessed as unlawful, but also for “unknown” images, as well as so-called evidence of “grooming”, using notoriously unreliable AI-based tools. We’ve all seen pictures wrongly flagged on social media because an AI tool mistakenly detected nudity, and we’ve all suffered the frustration of an important email landing in our spam folder.

These consequences are bad enough – but now imagine that the consequence is not just a lost email, but a report to the police accusing you of disseminating illegal child sexual abuse material or of grooming a child. For anyone wrongly accused, the inevitable result of such technologies would be unthinkable.
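To put a rough number on that risk, here is a minimal sketch of the base-rate problem with indiscriminate scanning. Every figure in it is invented for illustration – none comes from the Opinion, the proposal or any real service – but the arithmetic shows why even a seemingly accurate classifier, applied to billions of private messages, would wrongly flag people on a massive scale:

    # Hypothetical illustration of the base-rate problem with mass scanning.
    # All numbers are invented for this example; none come from the leaked
    # Opinion, the Commission's proposal or any real messaging service.

    messages_per_day = 10_000_000_000  # assume 10 billion messages scanned daily
    prevalence = 1 / 1_000_000         # assume 1 in a million messages is truly illegal
    false_positive_rate = 0.001        # assume 0.1% of innocent messages are wrongly flagged
    true_positive_rate = 0.99          # assume 99% of illegal messages are caught

    illegal = messages_per_day * prevalence
    innocent = messages_per_day - illegal

    true_flags = illegal * true_positive_rate      # correct detections
    false_flags = innocent * false_positive_rate   # innocent messages flagged

    precision = true_flags / (true_flags + false_flags)

    print(f"Correct flags per day: {true_flags:,.0f}")          # ~9,900
    print(f"Wrong flags per day:   {false_flags:,.0f}")         # ~10,000,000
    print(f"Share of flags that are correct: {precision:.2%}")  # ~0.10%

Under these purely hypothetical assumptions, fewer than one in a thousand flags would point to genuinely illegal content, while roughly ten million innocent messages a day would be reported for follow-up.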

The Opinion also highlights a serious deficit in the Commission’s supposedly evidence-based policy-making:

“The preferred implementation option should not be identified upfront, but emerge as [a] result of an analytical assessment and comparison process.”

This tells us that DG HOME already had their preferred approach, and sought to make the rest of the proposal fit it, rather than objectively analysing the situation. Given the lack of evidence and proportionality in the short-term version of this law, we are sad to say that this does not surprise us.

The Opinion also highlights that views from stakeholders – including those that have raised concerns about the proposal – “should be presented in a more transparent way”. This reinforces why it is so important that we stand up for our privacy while we are still able to.

The million-euro question, then, remains: if all of these serious issues still plague the proposal, why was it allowed to pass the scrutiny board? And what does this assessment say about the legality of the already-existing temporary legislation?

The Opinion explains that the Board’s approval comes with reservations because the Board “expects the DG to rectify” the “shortcomings” before the final proposal. This hardly seems an appropriate or reliable approach, given that the Opinion reveals a systematic failure by DG HOME to put forward a proportionate, evidence-based and fundamental-rights-respecting proposal. What’s more, the leaked document shows that an earlier version of the proposal failed its review last year. Vital lessons, including about the need to respect privacy, are clearly not being learned.

On 9 March 2022, Commissioner Johansson wrote to Members of the European Parliament, reassuring them that her proposal “would not prohibit or generally weaken encryption. The Commission is not considering proposing any mechanism or solutions in its proposal that will break this commitment.” On the contrary, what’s clear from this leaked Opinion is that – as we explained in a recent blog – undermining strong encryption seems to be exactly what the proposal would do.

In the run-up to the official proposal later this year, we urge all European Commissioners to remember their responsibilities to human rights, and to ensure that a proposal which threatens the very core of people’s right to privacy – a cornerstone of democratic society – is not put forward.

Ella Jakubowska

Policy Advisor

Twitter: @ellajakubowska1