
Keep private communications private

On 27 July, the European Commission published a Communication on an EU strategy for a more effective fight against child sexual abuse material (CSAM). The Communication indicates that messaging services (WhatsApp, Facebook Messenger…) may see their privacy protections undermined under new legislation that will be proposed this week.

By EDRi · September 10, 2020

Among the measures the Commission proposes is a requirement for platforms to snoop on all private communications. In addition, the Commission risks hindering all forms of encryption. What could go wrong?

The Communication brings three different measures, all of them with different degrees of risk for privacy.

First, it proposes to allow the continuation of “voluntary practices” by private companies to detect child sexual abuse after December 2020. From that date, the scope of the ePrivacy Directive will extend to so-called Over-The-Top services (OTTs) such as Facebook Messenger, WhatsApp and Instagram messages, prohibiting the scanning of communications on those services for commercial or other purposes unless the end-user has given their consent (e.g. scanning emails for computer viruses). The Commission fears that banning “voluntary” snooping of private communications may affect criminal investigations, despite evidence to the contrary (see below). Lastly, the Commission calls for “immediate action” in the form of a “narrowly-defined targeted solution”, which is expected to be published this week.

Second, in 2021 the Commission will propose the “necessary legislation to tackle child sexual abuse online effectively including by requiring relevant online services providers to detect known child sexual abuse material and require them to report that material to public authorities”. Who these relevant service providers are and how this will work in practice remains to be seen, but the measures will ring a bell for anyone who has worked on the Copyright Directive and the Terrorist Content Online Regulation.

Third, the Communication clearly labels encryption as a threat: “The use of encryption technology for criminal purposes therefore needs to be immediately addressed through possible solutions which could allow companies to detect and report child sexual abuse in end-to-end encrypted electronic communications.” Whether this will be done by undermining encryption or by allowing service providers to snoop on communications in apps before they are encrypted remains to be seen, but it is as vague as it is dangerous. The Communication on the EU Security Union Strategy, which has a much broader scope than CSAM, also mentions encryption, saying that “the Commission will explore and support balanced technical, operational and legal solutions to the challenges and promote an approach which both maintains the effectiveness of encryption in protecting privacy and security of communications, while providing an effective response to crime and terrorism.”

What does all of this mean for privacy and confidentiality?

Despite the positive aim of protecting children by preventing CSAM from being shared widely, the measures proposed in the Communication risk undermining the privacy and confidentiality of communications for everyone. Regarding the alleged “solutions” (namely the “targeted”, the platforms-snoop-all, or the break-encryption measures), the risks will vary depending on the actual measures adopted. However, once measures that undermine privacy become part of the toolbox for law enforcement and security agencies, it becomes a slippery slope from CSAM to other criminal offences. Ultimately this could result in measures against content that is legal, but regarded as undesired or “harmful” by the State, e.g. “misinformation”. Communication scanning goes hand in hand with undermining privacy protections such as end-to-end encryption. It would thus become impossible for individuals to know the extent to which their private communications are being surveilled by companies acting on behalf of the State. This would have a massive impact on our freedom of expression and other fundamental rights.

Solutions at hand: We have already shared how law enforcement can do its job without breaking encryption. Our work, based on recommendations from others such as Schneier, brings concrete ideas and solutions to problems that law enforcement agencies and intelligence services may face. Our solutions are targeted, whereas the initiatives proposed by the Commission are general and indiscriminate, imposing needless surveillance on the entire population. Today, we are well aware that once you are targeted by security services or law enforcement agencies, little can be done to counter it. It is therefore important to explore ways of dealing with criminality online that allow for encryption and other means of obfuscating identity, before attacking the general population’s ability to use end-to-end encryption and to communicate privately.

We will closely monitor the initiatives being proposed and continue to address any potential threats to encryption or to the privacy of communications. Privacy and confidentiality of communications are not absolute rights, but this does not mean that they can be dismantled in our online communications routinely and without safeguards. In the end, we should not be put at risk of being snooped on by national and foreign security services, intelligence services or private companies in exchange for the perceived protection of children’s rights.

Read more:

Communication on an EU strategy for a more effective fight against child sexual abuse (24.07.2020):

Interim Regulation to ensure that providers of online communications services can continue detecting and reporting child sexual abuse online and removing child sexual abuse material (10.09.2020):

Leak: Technical solutions to detect child sexual abuse material in end-to-end encrypted communications:

EDRi encryption position paper:

(Contribution by Diego Naranjo, Head of Policy from EDRi and Jesper Lund, from EDRi member IT-Pol Denmark)