
Is surveilling children really protecting them? Our concerns on the interim CSAM regulation

On 27 July, the European Commission published a Communication on an EU strategy for a more effective fight against child sexual abuse material (CSAM). The Communication announced several worrying measures that could have devastating effects on your privacy online. The first of these measures is out now.

By EDRi · September 24, 2020

The Communication on CSAM brings forward a number of legislative measures that have raised concerns among privacy activists. The first is an “interim” (until 2025) Regulation that is intended to be adopted swiftly, in a matter of weeks. What’s the rush, and what will this Regulation cover?

Your communications will be more protected in December. Or not.

By the end of the year, the European Electronic Communications Code (EECC) will enter into effect and change the legal definition of an “electronic communications service” to include Over The Top (OTT) services such as WhatsApp, Instagram messaging, Facebook Messenger, etc. This will automatically extend the scope of the ePrivacy Directive to cover OTT services. Therefore, all the provisions that protect the privacy and confidentiality of communications will apply to them as well. This should be good news for the millions of people using these services, who will see better protection of their online communications, but it has instead been perceived as a threat. In particular, current practices such as companies “voluntarily” (under pressure from national governments and the European Union) reading (scanning) private communications to detect CSAM should stop after December 2020. By then, Articles 5(1) and 6 of the ePrivacy Directive, which prohibit “listening, tapping, storage or other kinds of interception or surveillance of communications and the related traffic data by persons other than users, without the consent of the users concerned”, will apply to OTTs.

Cryptowars 3.0? What does the interim Regulation ask for?

The interim Regulation proposes “targeted” measures to “well-established technologies regularly used by providers of number-independent interpersonal communications services (…) that are in accordance with the state of the art used in the industry and are the least privacy-intrusive”. If you are somewhat puzzled as to what this means for you, you’re not alone.

First, this draft interim Regulation will ensure that private companies can keep or start scanning private communications in search of CSAM. It is unclear which “use of technologies for the processing of personal and other data” to detect CSAM is already happening, and under which legal basis (if any). The GDPR, which provides the current legal framework, does not necessarily allow indiscriminate scanning of private messages for illegal material. We would need to know much more about these companies and their specific practices before even starting a debate about allowing them. Julia Reda, a former MEP who is currently working with EDRi member organisation GFF, reminds us that the “strategy paper (PDF) that the EU Commission published in July shows that it has a particular focus on Facebook, which offers a whole range of relevant services with Messenger, WhatsApp and Instagram direct messages”, but that other services could fall into the net between now and the time the interim Regulation is adopted.

Second, no impact assessment or public consultations have been done or will be done to understand the impact of these measures on fundamental rights. The “urgency” (lack of planning by the Commission and inaction by Member States legislating nationally) does not allow for a proper assessment of what the interim Regulation could lead to, and public debate on the issue seems to have been deemed unnecessary. The Commission has been aware of the extended scope of the ePrivacy Directive since December 2018 when Member States were informed at the Telecommunications Council meeting.

Third, even though the interim Regulation is limited in time until 2025, it remains difficult to imagine that, if the interim Regulation is adopted, the provisions allowing for the scanning of communications will not find their way into the long-awaited ePrivacy Regulation. If they have already been agreed upon, why not include them in the new text?

Fourth, the Regulation puts private companies in charge of a matter that should be taken care of by public authorities. Delegating the fight against CSAM from law enforcement authorities to the surveillance apparatus of Facebook and other big tech companies is irresponsible and a step backwards in the fight against surveillance capitalism that the General Data Protection Regulation (GDPR), the proposed ePrivacy Regulation, the future Digital Services Act and updated competition policies represent. It will embolden big tech (and governments) to the detriment of citizens. By relying on voluntary measures by private companies, the Commission completely sidesteps the public discussion of whether indiscriminate scanning of everyone’s private communications is a necessary and proportionate measure. Whether Facebook should or shouldn’t be allowed to surveil children’s private communications in an effort to protect them against grooming is a debate that needs to be held democratically. It cannot be settled by a rushed interim Regulation that is expected to be adopted within weeks.

Finally, this Regulation sounds like just the beginning. The recently leaked internal discussion paper (PDF) entitled “Technical Solutions for Detecting Child Abuse in End-to-End Encrypted Communication” already envisages the need to break end-to-end encryption as we know it, which would unofficially launch the next Crypto Wars.

At a minimum, the debate on this proposal needs to be informed by opinions from the European Data Protection Supervisor (EDPS) and the Fundamental Rights Agency (FRA). In addition, a public debate including public consultations should be conducted, as well as an adequate impact assessment of the proposal. Anything less will not necessarily protect children, but will most likely make private communications for all, including children, subject to mass surveillance.

Read more:

Keep private communications private (10.09.2020): https://edri.org/our-work/keep-private-communications-private/

Fighting child sexual abuse: Commission proposes interim legislation to enable communications services to continue detecting child sexual abuse online (10.09.2020): https://ec.europa.eu/digital-single-market/en/news/fighting-child-sexual-abuse-commission-proposes-interim-legislation-enable-communications

Communication: EU strategy for a more effective fight against child sexual abuse (24.07.2020): https://ec.europa.eu/home-affairs/sites/homeaffairs/files/what-we-do/policies/european-agenda-security/20200724_com-2020-607-commission-communication_en.pdf

Edit Policy: Die neuen Crypto-Wars (in German) (14.09.2020): https://www.heise.de/meinung/Edit-Policy-Die-neuen-Crypto-Wars-4892651.html

European Commission internal discussion document leaked by Politico: “Technical solutions to detect child sexual abuse in end-to-end encrypted communications”: https://www.politico.eu/wp-content/uploads/2020/09/SKM_C45820090717470-1_new.pdf

(Contribution by Diego Naranjo, Head of Policy, EDRi)