EU Rules on Scanning Private Online Communications: Document Pool

This document pool contains analyses, updates and resources relating to EU rules on scanning private online communications, in particular the proposed ‘Regulation laying down rules to prevent and combat child sexual abuse’.

By EDRi · March 28, 2022

Content warning: this page and these resources contain discussions of child sexual abuse and exploitation.

"Privacy and safety are mutually enforcing rights. People’s ability to communicate without unjustified intrusion - whether online or offline - is vital for their rights and freedoms, as well as for the development of vibrant and secure communities, civil society and industry."

EDRi and 51 other civil society organisations

In this document pool we list articles and documents related to EU rules on scanning private online communications, in particular the ‘Regulation laying down rules to prevent and combat child sexual abuse’ (2022/0155(COD)). This will allow you to follow the development of the proposed measures and related regulatory actions.

Contents

  1. Latest updates from EDRi
  2. EDRi’s position on the scanning of online private communications
  3. EDRi’s background to the file
  4. Civil society open letters and demands
  5. Key dates and official documents
  6. Other useful resources
  7. Terminology
  8. Contact us

1. Latest updates from EDRi

2. EDRi’s position on the scanning of online private communications

3. EDRi’s background to the file

4. Civil society open letters and demands

5. Key dates and official documents

6. Other useful resources

On the importance of privacy, data protection and encryption for children and young people:

On the technical issues, cybersecurity risks and fundamental rights risks of generalised scanning tools and methods:

From the European Parliament:

7. Terminology

Chat control is a term used to refer to the EU’s approach to CSAM and the ePrivacy derogation(s). It is used because, in the name of tackling online CSAM, the EU has adopted laws which allow everybody’s supposedly private chats to be surveilled. MEP Patrick Breyer provides further background on the term.

‘CSAM’ stands for ‘Child Sexual Abuse Material’. It is a term used to refer to videos, photos, and sometimes written or audio content, which depict the sexual solicitation, abuse and exploitation of under-18s. In practice, the term usually concerns the online sharing of such material, for example via messages or in files uploaded to a cloud server. EU laws and policies which address CSAM have therefore tended to focus on tackling the online dimensions of the issue. Some countries refer to the online dimension specifically using the term ‘OCSEA material’ (Online Child Sexual Exploitation and Abuse material), but this is not yet a common term in the EU policy debate.

‘CSEA’ stands for ‘Child Sexual Exploitation and Abuse’, and is sometimes also referred to as ‘CSE’ (Child Sexual Exploitation).

A derogation is an exception, passed via a legislative process, which carves out provisions of another law so that they no longer apply in the specific context in which the derogation operates. The ‘Legislation to effectively tackle child sexual abuse’ will derogate from the EU’s ePrivacy Directive.

8. Contact us

Want to know more? Contact our advisors working on this file:

Ella Jakubowska

Policy Advisor

E-Mail: ella.jakubowska [at] edri [dot] org
PGP: 8B92 3E96 4E53 83F3 2F92 240A F53D 739E CFDA 2711
Twitter: @ellajakubowska1


Diego Naranjo

Head of Policy

E-Mail: diego.naranjo [at] edri [dot] org
PGP: 9A62 189E DB31 1798 6A8E FD45 E320 B10D 3493 8C21
Twitter: @DNBSevilla