Is this the most criticised draft EU law of all time?
An unprecedentedly broad range of stakeholders have raised concerns that, despite its important aims, the measures proposed in the draft EU Child Sexual Abuse Regulation are fundamentally incompatible with human rights.
The proposed EU ‘Regulation laying down rules to prevent and combat child sexual abuse’ (2022) (CSA Regulation, or CSAR) has prompted concerns that it is incompatible with EU fundamental rights and case law – perhaps more so than any other EU law in recent memory.
Whilst all stakeholders agree on the importance of the aim of protecting children, all formal legal and technical assessments have concluded that the proposed measures could amount to disproportionate violations of everyone’s privacy, personal data and free expression online, and that they rely on technically infeasible or dangerous methods.
Read on to see how a wide range of stakeholders, including child protection experts, survivors of CSA, police, national governments, UN officials, companies, NGOs and others have warned that the proposed measures are misguided and could do more harm than good.
"[The CSA Regulation] could become the basis for de facto generalised and indiscriminate scanning of the content of virtually all types of electronic communications of all users in the EU/EEA"
EU institutions and case law
The Court of Justice of the EU enforces strong protections for privacy, data protection, free expression, non-discrimination and other fundamental rights both online and off, for children and adults alike, on the basis of the Charter of Fundamental Rights. The serious and widespread threat posed by the draft CSAR to fundamental rights is reflected in all formal legal analyses from EU institutions about the proposal.
- The European Commission’s own impact assessment (SWD(2022)209) to the CSAR recognises that there are no scanning methods that have good levels of privacy, security and feasibility. Available via local download at: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Child-sexual-abuse-online-detection-removal-and-reporting_en
- The European Commission’s regulatory scrutiny board (RSB) warned before the law was officially proposed that it might amount to unlawful general monitoring: https://edri.org/our-work/leaked-opinion-of-the-commission-sets-off-alarm-bells-for-mass-surveillance-of-private-communications/
- The EU Council’s official Legal Service Opinion – in a rarely-seen major critique of a legislative proposal – warned of a “serious risk” of generalised monitoring, undermining encryption and violating the very essence of the right to privacy. Description available here: https://edri.org/our-work/despite-warning-from-lawyers-eu-governments-push-for-mass-surveillance-of-our-digital-private-lives/ and full document published at: https://www.bitsoffreedom.nl/wp-content/uploads/2023/05/20230426-opinion-legal-services-on-csar-proposal.pdf
- The European Data Protection Supervisor (EDPS) and Board (EDPB) warned that the proposal will severely harm innocent people with little to no evidence that it will protect children: https://edpb.europa.eu/our-work-tools/our-documents/edpbedps-joint-opinion/edpb-edps-joint-opinion-042022-proposal_en
- The independent impact assessment undertaken on behalf of the European Parliament’s civil liberties (LIBE) committee assessed the balance between safeguarding the privacy of users in general and children’s rights. It found that the CSAR lacks evidence of effectiveness and could not justify the serious extent to which it violates Articles 7 and 8 (privacy and personal data rights) of the EU Charter of Fundamental Rights: https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740248/EPRS_STU(2023)740248_EN.pdf
- Furthermore, the European Commission has admitted that the proposal relies on technical accuracy claims made by the suppliers themselves, which have never been independently verified: https://www.asktheeu.org/en/request/technologies_for_the_detection_o
Children and young people
International child rights law requires that children’s views are incorporated into laws which relate to their rights and safety, in order to respect their autonomy.
- According to a large representative survey, 80% of children in the EU say that they would not feel comfortable and safe being politically active or exploring their sexuality if authorities were able to monitor their digital communications on the basis of finding child abuse material: https://app.episto.fr/surveys/3bda4c61a861/charts?lg=kSSu
- The same survey shows that around two-thirds of young people in the EU rely on encrypted message services for communication, and around the same number disagree or totally disagree with the premise that providers should be allowed to scan their private chats. Instead, improving media literacy and reporting mechanisms are overwhelmingly favoured by the children surveyed.
- Young activists warn of being suppressed by the CSAR: https://edri.org/our-work/internal-markets-meps-wrestle-with-how-to-fix-commissions-csar-proposal/
Child sexual abuse survivors
- CSA survivor Alexander Hanff warns that the CSAR will discourage survivors from seeking support: https://www.euractiv.com/section/data-protection/news/new-eu-law-allows-screening-of-online-messages-to-detect-child-abuse/
- Survivors representative group MOGiS e.V. warn that the proposal will harm survivors and other young people: https://twitter.com/echo_pbreyer/status/1637847224641110020?t=hQUdyzuODWACsxr4IGS5Mg&s=19
- The Deputy Chairperson of MOGiS e.V. spoke to the lead MEPs working on the CSAR as a representative of CSA survivors, in a speech alerting them to how the law will do more harm than good for survivors of CSA, and how the Commission’s proposed approach is “fundamentally flawed”: https://edri.org/wp-content/uploads/2023/08/EU-LIBE-MOGiS-Hahne-07032023_EN.pdf
- Survivor of sexual violence, Marcel Schneider, is currently suing Facebook for the automated scanning of private messages, removing confidentiality for victims of abuse and criminal defendants: https://freiheitsrechte.org/en/ueber-die-gff/presse/pressemitteilungen-der-gesellschaft-fur-freiheitsrechte/pr_chatcontrol_facebook
- As pointed out by MEP Patrick Breyer, several survivors of CSA have also provided feedback to the Commission’s CSAR proposal, criticising it for harming survivors: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online/F3282960_en (in EN) and https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Bekampfung-des-sexuellen-Missbrauchs-von-Kindern-Erkennung-Entfernung-und-Meldung-illegaler-Online-Inhalte/F3264100_de (in DE)
- Weisser Ring, a German CSA victim support group, and the Portuguese Association for Victim Support (APAV) collectively warn that, despite its important aims, the proposal could harm confidential communications, putting people at risk of blackmail and fraud, will lead to false positives, and risks mass surveillance, all whilst lacking evidence of effectiveness: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online/F3338505_en
Child rights & child protection experts
- Expert in online child protection issues, Dr Sabine K Witting, has warned that the proposed grooming detection would criminalise self-expression between consenting adolescents: https://netzpolitik.org/2023/csam-verordnung-chatkontrolle-verletzt-sexuelle-selbstbestimmung-von-jugendlichen/
- The German child protection hotline, eco, calls the proposed measures in the CSAR “fundamentally questionable”: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online/F3338370_en
- The Dutch child protection hotline, EOKM (now known as Offlimits), has frequently warned that the generalised scanning measures will likely be harmful and ineffective (e.g. https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online/public-consultation_en). Founder of EOKM, Arda Gerkens, warns that the CSAR proposal “goes too far”: https://www.wired.co.uk/article/eu-child-abuse-law
- The German child protection association, the Kinderschutzbund, has warned that the proposed measures are disproportionate and likely ineffective: https://kinderschutzbund.de/wp-content/uploads/2023/02/Statement-for-the-public-hearing-on-chat-control-on-March-1-2023_DKSB.pdf
Police and prosecutors
- Public prosecutors in Germany have warned that the CSAR will not help them to find perpetrators, as it does not tackle the actual issues that they face: https://www.bundestag.de/ausschuesse/a23_digitales/Anhoerungen/932296-932296
- Police officers specialised in child protection in Germany have warned that the proposed measures will not help find more perpetrators, only more false alarms: https://edri.org/wp-content/uploads/2023/05/CSAR-summary-booklet.pdf
- A senior police officer in the Netherlands warned that the police will be unable to deal with the volume of reports predicted as a result of the proposal: https://edri.org/wp-content/uploads/2023/05/CSAR-summary-booklet.pdf
- The FBI warned Members of the European Parliament from the LIBE committee in a mission to Washington in May 2023 that “already now they [the FBI] do not have sufficient resources to deal with all CSAM cases they detect and they need to prioritize some of them. From this perspective, mass scanning of communication would not result in an increase in law enforcement”: https://www.europarl.europa.eu/doceo/document/LIBE-CR-750050_EN.pdf
Academics, scientists and technologists
- At the time of writing, 465 scientists and academics in cybersecurity, privacy and related fields have warned that the proposed measures are dangerous and untenable: https://docs.google.com/document/d/13Aeex72MtFBjKhExRTooVMWN9TC-pbH-5LEaAbMF91Y/edit
- 47 digital human rights NGOs in the EDRi network have warned that the proposal will jeopardise everyone’s online privacy and free expression and lacks evidence that it will achieve its goals, and therefore must be rejected: https://edri.org/wp-content/uploads/2023/05/CSAR-summary-booklet.pdf
- Feminist technology researchers warn that the CSAR is a paternalistic and disempowering approach to gender-based violence: https://feministtechpolicy.org/en/case-studies/csa-regulation/
- Public interest technologists in Europe warned against the CSAR’s proposed measures: https://www.politico.eu/wp-content/uploads/2023/05/10/Experts-letter-encryption-CSA.pdf
- The Council of European Professional Informatics Societies (CEPIS) warned about the threat to encryption: https://cepis.org/app/uploads/2022/03/Open-letter-on-a-right-for-encryption-CEPIS-2022-nosig.pdf
- Researchers at TU Delft warn that the technological claims on which the CSAR is based are misleading or false: https://www.euractiv.com/section/platforms/news/fact-checkers-call-out-commission-on-anti-child-abuse-material-proposal/
- Professor Matthew Green warns about the risk of poisoning images in order to manipulate scanning technologies, among other serious concerns: https://blog.cryptographyengineering.com/2023/03/23/remarks-on-chat-control/
- President of the Signal Foundation, Meredith Whittaker, warned of the major risk of mass surveillance: https://edri.org/wp-content/uploads/2023/04/EDRi20anniv_Meredith_Whittaker.pdf
- 14 eminent scholars warn of “bugs in our pockets”, or the dangers of client-side scanning, a method proposed by the European Commission in order to execute the CSAR’s proposed detection orders. The academics explain that these methods are ineffective, with a high potential for misuse or abuse: https://arxiv.org/abs/2110.07450
- Professor of Cyber Security and Policy, Dr. Susan Landau, has warned that “there is simply no technology” to do what the EU’s CSAR wants to do: https://progressivepost.eu/the-eus-dangerous-proposal-for-stopping-online-child-sexual-abuse-material/
- Edward Snowden warns that the CSAR is an “authoritarian” move which would restrict “basic human liberty”: https://twitter.com/Snowden/status/1662103652549423105
- Security expert Alec Muffet calls the proposal a “war on end-to-end encryption”: https://twitter.com/AlecMuffett/status/1524315613421879297
- Professor of Security Engineering, Ross Anderson, outlines not just the technical issues with the CSA Regulation proposal, but also its failure to engage with the broader social issues that lead to CSA, as well as other strategic concerns relating to the development and sale of surveillance technologies: https://arxiv.org/abs/2210.08958. Professor Anderson further highlights research showing that centralised takedowns, as proposed in the CSA Regulation, can add weeks to the time it takes to remove CSAM from the internet: https://www.lightbluetouchpaper.org/2022/05/11/european-commission-prefers-breaking-privacy-to-protecting-kids/
Independent legal analyses
- A legal assessment from the University of Amsterdam’s Institute for Information Law (IViR) confirmed the threat to encryption: https://www.ivir.nl/publicaties/download/CSAMreport.pdf
- Director of CDT, Iverna McGowan, a former Senior Advisor to the UN Office of the High Commissioner for Human Rights, warns that the proposed Detection Orders are based on a flawed logic which would undermine procedural rights and the presumption of innocence: https://www.euractiv.com/section/law-enforcement/opinion/guilty-until-proven-innocent-the-flawed-logic-of-the-eus-proposed-child-safety-law/
- Assistant Professors at Leiden University, Center for Law and Digital Technology (eLaw), Dr Witting and Dr Leiser, warn that the proposal likely violates the EU prohibition of general monitoring, and its content moderation measures could put children at risk of harm: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online/F3337766_en
- Former EU Court of Justice judge, Professor Dr. Ninon Colneric, published an assessment that filtering for CSAM in a generalised and indiscriminate manner would be incompatible with EU case law: https://www.patrick-breyer.de/wp-content/uploads/2021/03/Legal-Opinion-Screening-for-child-pornography-2021-03-04.pdf
- Legal researcher at KU Leuven Center for IT & IP law, Charlotte Somers, calls the proposal “an attack on Europeans’ privacy and data protection” and notes the high likelihood of CJEU annulment: https://www.law.kuleuven.be/citip/blog/the-proposed-csam-regulation-trampling-privacy-in-the-fight-against-child-sexual-abuse/
National governments and parliaments
- 🇵🇱 The government of Poland has slammed the CSAR for the risk of mass surveillance: https://www.euractiv.com/section/law-enforcement/interview/poland-slams-child-sexual-abuse-material-regulation-as-unnecessary/
- 🇦🇹 The Austrian Parliament has agreed to reject the CSAR unless it can guarantee the protection of encryption, contains no general monitoring obligation and respects human rights. All parties in the Austrian parliament voted against the CSAR proposal: https://en.epicenter.works/content/chat-control-a-good-day-for-privacy
- 🇪🇪 🇫🇮 🇲🇹 🇮🇹 🇳🇱 Governments of Estonia, Finland, Malta, Italy and the Netherlands have all raised serious problems: https://www.documentcloud.org/documents/23819681-law-enforcement-working-party-document-encryption
- 🇸🇪 The government of Sweden attempted to fully protect end-to-end encryption from the proposed measures and to stop general monitoring: https://edri.org/our-work/despite-warning-from-lawyers-eu-governments-push-for-mass-surveillance-of-our-digital-private-lives/
- 🇫🇷 A resolution from the French senate warned about risks with many parts of the proposal, especially the search for “unknown” material and grooming: https://edri.org/our-work/irish-and-french-parliamentarians-sound-the-alarm-about-eus-csa-regulation/
- 🇮🇪 The Irish Parliament’s justice committee raised serious concerns about the CSAR: https://edri.org/our-work/irish-and-french-parliamentarians-sound-the-alarm-about-eus-csa-regulation/
- 🇩🇪 The Digital committee of the German Parliament hosted experts including police and child protection experts, and found unanimous criticism of the CSAR. This led to the committee raising their concerns: https://www.bundestag.de/dokumente/textarchiv/2023/kw09-pa-digitales-928540
- 🇩🇪 The legal service of the German Parliament regards the proposal’s measures as generalised monitoring and thus violating the Charter of Fundamental Rights. It furthermore warns that the proposed measures might be adopted in other areas (i.e. scope creep): https://www.bundestag.de/resource/blob/914580/9eba1ff3a5daa7708fca92e3184a1ae3/WD-10-026-22-pdf-data.pdf
- 🇦🇹 🇩🇪 The governments of Austria and Germany have committed to fully protecting end-to-end encryption from this regulation: https://netzpolitik.org/2023/staendige-vertreter-eu-staaten-wollen-chatkontrolle-trotz-warnung-ihrer-juristen/#2023-05-31_St%C3%A4V_AStV_CSA-VO
- 🇨🇿 The human rights committee of the Czech Parliament raised serious human rights concerns, especially regarding the protection of encryption: https://www.psp.cz/sqw/text/orig2.sqw?idd=216063&pdf=1
- 🇳🇱 MPs in the Dutch Parliament adopted a motion calling on their government not to allow a CSAR to go ahead which would – as per the draft – undermine end-to-end encryption and violate the right to confidentiality of communications: https://www.tweedekamer.nl/kamerstukken/moties/detail?id=2023Z07239&did=2023D17019
The European Parliament
- 14 MEPs from the lead Parliamentary committee (Civil Liberties, or LIBE), coming from 4 political groups, demand rejection of the CSAR: https://edri.org/our-work/civil-liberties-meps-warn-against-undermining-or-circumventing-encryption-in-csar/
- They are joined by many MEPs in the supporting IMCO (Internal Market and Consumer Protection) committee who criticised the proposal’s scope, the threat its measures pose to encryption, and the serious risks posed by Detection Orders, with Renew group MEPs Hahn and Körner proposing the full deletion of these parts: https://edri.org/our-work/internal-markets-meps-wrestle-with-how-to-fix-commissions-csar-proposal/
- MEP David Lega (EPP), co-chair of the European Parliament’s Intergroup on Children’s Rights, speaks out against the CSAR proposal: https://samnytt.se/fler-partier-ansluter-sig-till-sds-chat-control-motstand/
- The aforementioned complementary impact assessment to the CSAR on behalf of the LIBE committee further confirmed the lack of proportionality and fundamental rights compliance of the proposal: https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740248/EPRS_STU(2023)740248_EN.pdf
Companies and industry
- Microsoft warned that the claimed 88% accuracy figure for grooming detection that the European Commission put forward is not reliable (see especially annex 9, pages 285 – 315): https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12726-Fighting-child-sexual-abuse-detection-removal-and-reporting-of-illegal-content-online/F3338552_en
- An article from PhotoDNA user Cloudflare shows the ease of generating false alerts, despite claims from PhotoDNA developer Dr Hany Farid (which the European Commission has relied on in the draft CSAR) that this is as rare as 1 in 50 billion: https://blog.cloudflare.com/the-csam-scanning-tool/
- 9 European industry associations representing members like Google, Meta, Microsoft and Mozilla have voiced their concerns: https://ccianet.org/news/2023/04/csa-industry-associations-suggest-improvements-to-eu-regulation-in-joint-statement/
- LinkedIn reports a false-positive rate of 59% for its current voluntary use of PhotoDNA, which strongly contradicts the accuracy figures for PhotoDNA often cited by the European Commission and other proponents of the CSAR: https://www.linkedin.com/help/linkedin/answer/a1347128
- Swedish VPN company Mullvad VPN explains the technological issues with the CSAR: https://mullvad.net/fr/blog/2023/3/28/the-european-commission-does-not-understand-what-is-written-in-its-own-chat-control-bill/
- ProtonMail CEO explains that they are concerned about the CSAR and the threat it poses to encryption and privacy: https://www.wired.co.uk/article/encryption-faces-an-existential-threat-in-europe
The United Nations
- The UN High Commissioner for Human Rights warns of generalised surveillance risk of CSAR: https://www.ohchr.org/en/press-releases/2022/09/spyware-and-surveillance-threats-privacy-and-human-rights-growing-un-report
Professional secrecy associations (journalists, lawyers, etc.)
- The two biggest Austrian journalistic associations have warned about the CSAR and the risk that it would undermine the constitutional protections for journalistic freedom and the protection of sources: https://www.derstandard.at/story/2000135631586/chatkontrollevoez-will-gesetzgebungsprozess
- The Austrian lawyers association has assessed that the proposal undermines fundamental rights and warns against it: https://www.youtube.com/watch?v=bU1dy0dcKYs
- The German Bar Association (DAV) Committee on Surveillance published a position stating that, whilst it supports the aim of the draft regulation, it criticises the proposed automated mass analysis of communication (content) data: https://dav-international.eu/en/newsroom/position-paper-32-23-csar-2
Civil society and the public
- 134 NGOs across a wide range of human rights, digital rights, children’s rights, women’s rights and online safety have called on the EU to reject the CSAR: https://edri.org/our-work/european-commission-must-uphold-privacy-security-and-free-expression-by-withdrawing-new-law/
- Belgian NGOs have questioned whether the Belgian government is failing to protect human rights by not opposing generalised surveillance in the CSAR negotiations: https://edri.org/our-work/the-belgian-government-is-failing-to-consider-human-rights-in-csa-regulation/
- 11,000+ Europeans ask the EU to reconsider this dangerous and misguided law: https://edri.org/our-work/children-deserve-a-secure-and-safe-internet/
Read more about the CSAR:
Download our full position paper: 'A Safe Internet for All: Upholding Private and Secure Communications' (1.3MB pdf)
EDRi's assessment based on relevant case law from the Court of Justice of the European Union is that the CJEU would very likely find the proposed measures in the CSA Regulation unlawful.
Read our short guide about the key problems with the CSAR proposal (age verification, threat to encryption, mass surveillance).
Some of the companies or individuals developing and/or selling scanning tech have made dubious claims about the accuracy and safety of their tech. Read our assessment and debunking of some of the most pervasive false claims!
EDRi’s document pool rounds up key resources and developments about the file, especially as they relate to the European Commission, European Parliament and EU Council.
Is there an important critique or concern missing from this page? Perhaps your NGO, group or association has published a statement? Let us know:
- Tweet us @edri
- Toot us on mastodon at
- Email our advisors working on the file at ella.jakubowska [at] edri [dot] org and diego.naranjo [at] edri [dot] org