Be scanned – or get banned!
In the latest in a string of alarming developments, the Belgian government has proposed a new supposed ‘solution’ to the Chat Control (Child Sexual Abuse Regulation) deadlock in the Council. Providers of private communications services must ask people to consent to faulty AI-based scanning of their private chats, they suggest – or be banned from sharing images, videos and URLs!
Belgium – currently holding the unenviable role of trying to broker a compromise between European Union (EU) Member States – claimed earlier this year that they had found a “more proportionate” way forward on this controversial law. Governments of countries including Germany, Austria, France, Poland, the Netherlands, Italy, Estonia and Finland have reportedly raised concerns throughout the process. However, they are now under even more extreme pressure to reach a Council deal on the CSA Regulation, even if they have to give up on digital rights and technical reality in the process.
Now, a proposed Council text dated 28 May (9093/24) reveals a starkly different reality from Belgium’s promise of a human rights-respecting CSA Regulation. This new draft carries over the many issues we’ve recently warned about (such as a shocking misunderstanding of statistics when images are wrongly flagged, a failure to respect vital end-to-end encryption, and a risk categorisation methodology that punishes secure and privacy-respecting services whilst rewarding those that systematically violate people’s privacy). It also adds two major new problems:
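To see why the statistics problem matters, consider a short base-rate calculation. The numbers below are purely illustrative assumptions (not figures from the Council text): even a scanner with a seemingly tiny error rate produces overwhelmingly false alarms when the material it searches for is rare among billions of everyday messages.

```python
# Illustrative base-rate arithmetic with ASSUMED numbers:
# when the target content is very rare, almost all flags are false positives,
# even for a scanner that sounds accurate on paper.

def flag_counts(messages, prevalence, true_positive_rate, false_positive_rate):
    """Return (true flags, false flags) for a hypothetical scanner."""
    illegal = messages * prevalence
    legal = messages - illegal
    true_flags = illegal * true_positive_rate
    false_flags = legal * false_positive_rate
    return true_flags, false_flags

# Assumed: 1 billion messages a day, 1 in a million contains illegal material,
# the scanner catches 90% of it and wrongly flags 0.1% of innocent messages.
tp, fp = flag_counts(1_000_000_000, 1 / 1_000_000, 0.90, 0.001)
print(f"true flags: {tp:,.0f}, false flags: {fp:,.0f}")
print(f"share of flags that are false: {fp / (tp + fp):.1%}")
```

Under these assumed numbers, false flags outnumber true ones by more than a thousand to one, so the vast majority of people reported to authorities would be innocent.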
1. Forced consent
The premise underpinning this new Belgian approach seems to be that if people consent to scanning, then it isn’t mass surveillance. This is a severe misunderstanding not only of mass surveillance, but also of consent. Consent to data processing, under EU law, needs to be given freely. It cannot be forced under the threat that without giving such consent, you will no longer be able to share pictures with your family, send links to your colleagues, and much more. Big Tech platforms have for years coerced users into accepting unlawful data processing practices as a condition for using the service. With the latest text, EU governments will literally become copycats of Big Tech’s abusive practices and dark patterns.
There’s the additional problem that it’s hard to see how such a mechanism could be effective. Those looking to distribute child sexual abuse material (CSAM) will simply refuse consent to be scanned, and will instead move to another service when a detection order forces the provider to restrict sharing functionality for non-consenting users. People who remain on the service, however, will still be caught in the dragnet of flawed AI surveillance – from teenagers whose consensual sexual imagery will be flagged and reported, to those whose content is mistakenly flagged as CSAM.
2. A rose by any other name…
This new Belgian approach has been coined “upload moderation”. However, despite a nicer-sounding name, what’s on the table would still be the same thing: the mass scanning of the private communications of people who are not suspected of any crime, even in end-to-end encrypted environments. Technology and cybersecurity experts have repeatedly warned that this cannot be done safely and securely – putting at risk the private communications of activists, journalists, young people, businesses and even governments!
Worse still, the Belgian proposal claims that end-to-end encryption is not compromised if the “upload moderation” (scanning) is done before the message is encrypted. However, this essentially demands the use of client-side scanning – a form of spyware on end-user devices. Cybersecurity experts have pointed out that client-side scanning will undermine the security of private communications and leave people vulnerable to a number of attacks by malicious actors:
“CSS by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which client-side scanning can fail, can be evaded, and can be abused.” – Bugs in Our Pockets, Abelson et al. (2024)
Given the risk categorisation approach proposed by the Belgian government, which shockingly posits more secure and more privacy-respecting services as more dangerous, most end-to-end encrypted communication services are likely to be hit with a detection order. This will critically undermine the security of the service – which would effectively be the end of secure end-to-end encrypted communications in Europe.
There’s a reason why we’ve always said the EU needs to take a different approach. It’s because there is no way to mass scan and report on people’s private communications without violating the core of their right to privacy and creating massive security vulnerabilities. The search for a magic technical solution is doomed to fail because there is no magic solution to the serious problem of child sexual abuse – which instead requires multi-faceted societal-level and political solutions.
If EU Member States take the step of agreeing to Belgium’s new approach, they will effectively ban the sharing of pictures, videos and links in the EU by only “allowing” this type of content in private communications under mass surveillance. It might sound like an episode of Black Mirror – but unless governments stand up for our rights and scientific evidence, it could be coming to all our devices soon.
Read more
- Position paper: A safe internet for all – Upholding private and secure communications
  Despite the importance of its goals, the European Union’s proposed Child Sexual Abuse Regulation (CSAR) will not only fail in its aims to protect young people, but it...
- CSA Regulation Document Pool
  This document pool contains updates and resources on the EU’s proposed ‘Regulation laying down rules to prevent and combat child sexual abuse’ (CSA Regulation)
- Is this the most criticised draft EU law of all time?
  An unprecedentedly broad range of stakeholders have raised concerns that despite its important aims, the measures proposed in the draft EU Child Sexual Abuse Regulation are fundamentally incompatible...
- Open letter: Mass surveillance and undermining encryption still on table in EU Council
  Today, 17 April, EDRi, in a coalition with 50 civil society organisations and 26 individual experts, call on Member State representatives not to agree to the proposed EU...
- CSAR: European Parliament rejects mass scanning of private messages. Here is why
  On 22 November, the European Parliament officially adopted its position on the draft ‘Regulation laying down rules to prevent and combat child sexual abuse’ (CSAR). With strong support...
- Activists come to Brussels to tell MEPs to ensure everyone’s digital security amid mass surveillance measures in CSA Regulation
  Between 9 and 11 October, 23 Stop Scanning Me activists from 13 European countries travelled to Brussels. They were students, parents, lawyers, young activists, human rights defenders and...