A beginner’s guide to EU rules on scanning private communications: Part 2

Vital EU rules on human rights and due process protect all of us from unfair, arbitrary or discriminatory interference with our privacy by states and companies. As we await the European Commission’s proposal for a law which, we fear, may make it mandatory for online chat and email services to scan every person’s private messages all the time (which could amount to mass surveillance), this blog explores what rights-respecting investigations into child sexual abuse material (CSAM) should look like instead.

By EDRi · February 2, 2022

Content warning: discussion of child sexual abuse and exploitation

In 2021, the European Union (EU) institutions responsible for making laws agreed to pass the temporary derogation from certain provisions of the ePrivacy Directive. This new law allows certain companies to scan everyone’s private messages and chats, even though such practices may be incompatible with the EU’s human rights and data protection laws.

In the first part of this blog series, we explored how changes to definitions in the European Electronic Communications Code (EECC) led to a situation of panic, which may have enabled the European Commission to push through the temporary law despite so many concerns having been raised.

The temporary derogation will expire in August 2024, and the EU wants to replace it before then with a ‘long-term’ version. The long-term proposal is currently scheduled for 30 March 2022, although its publication has already been pushed back several times, meaning that the clock is ticking. In this blog, we take a look at what could be coming up in the new proposal, and how the investigation of online CSAM should be conducted in order to meet the standards required by EU law.

The Commission’s plans for Chat Control

European Commissioner Ylva Johansson, the Commissioner responsible for EU laws and policy on Migration and Home Affairs, has spoken repeatedly with the press to emphasise the hard line that she is taking. She recently met with almost a dozen US-based tech companies about her plans, warning them:

“I will propose to make it [the automated scanning of private communications] mandatory so you better shape up and start realising that this is going to happen”.

In other words, the new proposal will take even further-reaching steps than the voluntary scanning currently permitted by the short-term law (which, as the European Parliament has pointed out, might already be unlawful).

Given that Members of the European Parliament (MEPs) have warned that the short-term law lacks a legal basis and would probably be invalidated if it were taken to court, it is deeply concerning that the Commissioner wants to put forward rules which will take an even more extreme stance against the privacy of the EU’s 447 million inhabitants. There have even been fears that the proposal might seek to undermine encryption, which is a vital technology that we all rely on every day, for example to make online bank transactions, to communicate with our doctor and even for governments to protect intelligence.

The Commission’s plans have been dubbed “Chat Control” by Patrick Breyer, a human rights lawyer and MEP, because they seek to automatically scan the chats, messages and web-based emails of every person in the EU (including young people). In effect, this would mandate the surveillance and control of all our private communications by Big Tech companies like Facebook, enabled by proprietary scanning tools from Microsoft and other companies, which we are therefore unable to externally audit.

We fear that the Commission are poised to propose a ‘solution’ which lacks a legal basis, will compel corporations to use secretive technology to invade all of our messages and chats, may remove our ability to choose privacy-respecting messaging services, will put our devices at heightened risk of hacking, and may constitute mass surveillance.

Furthermore, for a complex and controversial law like this one – with potentially enormous consequences for people’s rights and liberties – it is important that the process of negotiating the proposal is not rushed. Unlike what happened with the temporary derogation, it is vital that MEPs are given ample time to perform their role of democratic scrutiny, and are not deterred from voicing concerns by accusations that they are not committed to protecting children.

Is there a rights-respecting way to investigate online CSAM?

Democracy and the rule of law are founded on rights such as good administration and the presumption of innocence, as well as principles such as accountability and due process. These rules underpin the proper functioning of our justice systems. They ensure that vital evidence can be admissible in court and that cases don’t fall apart because a suspect was mistreated. As a result, they make it more likely that justice can be achieved for victims. They also protect human rights defenders, government critics and journalists from reprisals and ensure that we can all speak freely.

When it comes to detecting, investigating and prosecuting online CSAM, it is no different. Those who view or disseminate online child sexual abuse or exploitation are committing an egregious crime, and must be investigated and prosecuted for this. To do this, law enforcement agencies should tackle online CSAM in the same way that they tackle any other case: receiving reports, following leads, singling out suspects, conducting investigations into those suspects and building up evidence in a lawful way.

In certain cases, they might apply for a court order to covertly intercept the phone calls, messages or letters of a suspect – which is acceptable (as long as they can justify that such a move is necessary, proportionate and lawful, of course). The investigation of serious crimes does not, however, mean that governments can take any measure at any cost. The sensitivity of the topic of CSAM cannot be used to silence voices that call for police investigations to be conducted in a necessary, proportionate and lawful way. Nor should the legal responsibility for the spread of such content be outsourced to service providers, making them responsible for content that is a matter for law enforcement agencies.

In a democratic society, law enforcement cannot cast a wide net of surveillance ‘just in case’ they might find a crime. Governments can intrude on people’s privacy only if they have a very good reason to do so, such as that person being individually suspected of a crime that justifies that particular intrusion. This is vital for protecting each and every one of us from state over-reach and arbitrary investigations, for making sure that we are not unfairly targeted or discriminated against, and for ensuring that decisions made by law enforcement can be reviewed in the event that wrongdoing is alleged. This protects suspects, witnesses and victims, as well as the police officers themselves, by creating a paper trail for accountability purposes.

A holistic approach should also tackle the issue in its grave context: child sexual abuse and exploitation does not exist because of digital technologies, even though the internet exacerbates its spread and the ease with which such material can be created. In fact, in 2017, the European Parliament reported that countries across the EU have failed to implement a series of measures which were adopted in a 2011 Directive to tackle the issue of child sexual abuse and exploitation. If countries still systematically fail to follow existing measures, then additional laws are at best premature.

Given the lack of implementation of the 2011 Directive, and in the absence of a legal basis for the short-term derogation, will the Commission see sense and propose only lawful, targeted, open-source methods and techniques for investigating online CSAM?

Keep your eyes peeled for the third instalment of this blog, where we will outline our 10 principles for scanning private communications in the EU in a way that respects fundamental rights. And mark 30 March in your diary because – if the Commission continues to ignore civil society concerns – it could be the end of privacy in the EU as we know it.

Contribution by:

Ella Jakubowska

Policy Advisor

Twitter: @ellajakubowska1