A beginner’s guide to EU rules on scanning private communications: Part 1

In July 2021, the European Parliament and EU Council agreed temporary rules to allow webmail and messenger services to scan everyone’s private online communications. In 2022, the European Commission will propose a long-term version of these rules. In the first installment of this EDRi blog series on online ‘CSAM’ detection, we explore the history of the file, and why it is relevant for everyone’s digital rights.

By EDRi · December 15, 2021

Content warning: discussion of child sexual abuse and exploitation

In July 2021, European Union (EU) member state ministers reached an agreement with the European Parliament to pass a new law creating a temporary exception (derogation) from certain parts of the 2002 ePrivacy Directive. This derogation allows electronic communications services, like chat or webmail services, to conduct automated scanning of everyone’s private communications, all of the time, rather than limiting surveillance to genuine suspects in line with due process. Such generalised scanning practices can constitute a form of mass surveillance. They pose a serious risk to everyone’s fundamental rights because they treat each one of us as suspicious.

The derogation will expire in August 2024, and the European Commission intends to replace it with a long-term version which they will put forward in 2022. The purported goal of these derogations is to allow companies to detect online ‘CSAM’ (child sexual abuse material). Yet the temporary derogation allows companies to conduct the mass scanning of everybody’s private messages and chats, instead of limiting surveillance to those against whom there is reasonable, lawful suspicion. Worse still, the Commission has indicated that the long-term law might make generalised scanning of everyone’s personal communications mandatory. If passed, such sweeping and disproportionately-invasive measures would likely do far more harm than good.

As child abuse survivor and privacy advocate Alexander Hanff has pointed out, the issue has been instrumentalised by companies pushing their lucrative technology, and by politicians making a “knee jerk reaction”. Hanff continues that politicians have proposed overly broad measures without considering the severe risks of their proposals to the very survivors they purport to be protecting, nor the negative impact on wider society.

EDRi’s goal for this file, therefore, is to make sure that any proposals to detect online CSAM are in line with the EU’s fundamental rights obligations, in particular that measures are lawful, targeted, as well as clearly and objectively proportionate to their stated goal. In order to better understand what’s going on today, let’s take a look back at some relevant pieces of EU legislation:

2002: the ePrivacy Directive

The ePrivacy Directive is the EU’s only instrument containing specific protections for everyone’s right to a private life and confidentiality of communications, as enshrined in Article 7 of the EU Charter of Fundamental Rights. It was supposed to be updated in a new ePrivacy Regulation in 2017, as a counterpart to the better-known General Data Protection Regulation (GDPR). But (perhaps ironically, given how fast the derogation was passed) ePrivacy has been in limbo ever since 2017, with lawmakers unable to agree how best to update its rules to today’s digital environment.

The adoption of the ePrivacy Directive in 2002 was an important milestone in preventing the surveillance of all our private communications without a legitimate, lawful reason and proper safeguards. The Directive was passed by EU lawmakers in recognition of the vital need to protect our privacy. As UN High Commissioner for Human Rights, Michelle Bachelet, explains:

“the right to privacy plays a pivotal role in the balance of power between the State and the individual and is a foundational right for a democratic society”.

The ePrivacy Directive is so important because it recognises that privacy, and the confidentiality of communications, are not abstract values. Rather, privacy is a right that underpins each of us being able to vote freely in elections, to expose corruption and abuses of power, to access healthcare, and to speak freely with our friends and family. Confidentiality is what enables journalists, doctors, lawyers and human rights defenders to do their work.

2011: the CSEA Directive

Under a different policy stream, in December 2011, the co-legislative bodies of the EU officially adopted the Child Sexual Exploitation and Abuse (CSEA) Directive. Addressing the online storage, dissemination and amplification of CSAM is a serious and sombre topic, underpinned by complex societal and criminal issues. The European Commission and the Council of Europe emphasise that the EU has a legal obligation to protect young people from abuse. Yet as noted in a recent independent report commissioned by the Council of Europe, the CSEA Directive, which was the EU’s first major legal effort to tackle online CSAM, has not been a success.

Only half of EU Member States have adopted crucial measures to ensure that web pages containing or disseminating CSAM are promptly removed (Article 25, CSEA Directive). Furthermore, a recent investigation in Germany has given a chilling example of the lack of action to remove CSAM even after authorities have been alerted to its existence. Journalists revealed vast numbers of photos and videos of child abuse that had remained online for years after being reported, because German police do not have the “human resources” to take down this illegal content.

If European police forces don’t have the basic resources to remove traumatic CSAM that is already out there – and which hotlines like INHOPE urge must be swiftly removed as a top priority in order to avoid re-victimising survivors – then how do they plan to deal with a sharp increase in the volume of online CSAM that will be detected as a result of widening scanning practices? Furthermore, given the problems with the CSEA Directive, why did the European Commission choose not to prioritise enforcing the removal of CSAM once authorities have been notified, and instead focus, via the derogation, on surveilling every person’s personal communications?

2018-2020: the new scope of the European Electronic Communications Code

Fast forward to 2018, when the ‘recast’ of a separate law, the European Electronic Communications Code (EECC), expanded the definition of an ‘electronic communications service’. This expansion meant that from December 2020, certain rules in the ePrivacy Directive applied to services like Facebook Messenger and WhatsApp, to web-based email services, and even to the chats in dating apps.

This should have been a good thing: these changes to the EECC stopped private companies that provide electronic communications services from being able to snoop on everybody’s private conversations. Just like the police can’t tap all of our phones or open our mail unless they have good reason to suspect us of a crime grave enough to justify such an interference, neither should corporations be allowed to tap everyone’s messages, all of the time.

2020-21: the temporary ePrivacy derogation

Yet as the December 2020 application of the EECC’s expanded scope drew closer, the European Commission faced significant pressure to carve out an exception to the ePrivacy Directive’s rules from the companies that profit from selling automated scanning tools. The European Commission wanted to push through, before the end of 2020, a temporary derogation allowing companies to continue to voluntarily scan everyone’s private communications. This would be followed by a longer-term deviation from the ePrivacy Directive’s protections against mass snooping. Even celebrities weighed in, with Ashton Kutcher and Demi Moore reportedly advising the Commission to hurry up.

However, as the EU’s own Data Protection Supervisor has warned, the European Commission’s proposal severely lacked proper engagement with evidence, proportionality, and legality, which are some of the cornerstones of the EU’s human rights regime. The proposal was also criticised by the Council of Europe, by digital rights groups like EDRi, and by members of the European Parliament (MEPs), who asked for more time to strengthen the safeguards in the Commission’s proposal.

The Commission justified their haste by saying that ePrivacy had created an emergency because companies could no longer scan people’s private messages for CSAM. This came as a big surprise to us: up until this point, there had been no public information that such scanning of all our private messages (yes, yours included) was even happening! You can read more about some of the (many) issues with the temporary derogation – which was passed in July 2021, and entered into force in August 2021 – in EDRi’s analysis here. MEPs likened the process to “moral blackmail”, pointing out that the practices outlined in the temporary derogation may not be lawful under the GDPR, and would likely be invalidated if taken to court.

2022: awaiting the long-term ePrivacy derogation

Now that we’ve looked at the history of online CSAM measures in EU legislation, watch out for the next installment of this series. There, we will consider the key issues surrounding the long-term ePrivacy derogation, currently expected on 8 March 2022, and how to ensure that all EU measures to tackle online CSAM are compliant with fundamental rights rules.

Key terms

  • Chat control: chat control is a term used to refer to the EU’s approach to CSAM and the ePrivacy derogation(s). It is used because, purportedly to tackle online CSAM, the EU has adopted laws which allow for everybody’s supposedly private chats to be surveilled. Read more thanks to MEP Patrick Breyer here.
  • CSAM: ‘CSAM’ stands for ‘Child Sexual Abuse Material’. It’s a term used to refer to videos, photos, and sometimes written or audio content, which depict the sexual solicitation, abuse and exploitation of under-18s. Generally, CSAM is about the online sharing of such material, for example via messages, or in materials that are uploaded to a cloud server. EU laws and policies which address CSAM have thus tended to focus on tackling the online dimensions of the issue.
  • Derogation: a derogation is an exception, passed via a legislative process, to carve out provisions of another law that will no longer apply in the specific context in which the derogation operates.
  • Recast: the European Commission says that recasting ‘brings together in a single new act a legislative act and all the amendments made to it. The new act passes through the full legislative process and repeals all the acts being recast. […] [R]ecasting involves new substantive changes, as amendments are made to the original act during preparation of the recast text’. Read more on the Commission’s site here.

Contribution by:

Ella Jakubowska

Policy Advisor

Twitter: @ellajakubowska1