Be scanned – or get banned!


By EDRi · May 30, 2024

In the latest in a string of alarming developments, the Belgian government has proposed a new supposed ‘solution’ to the Chat Control (Child Sexual Abuse Regulation) deadlock in the Council. Providers of private communications services must ask people to consent to faulty AI-based scanning of their private chats, they suggest – or be banned from sharing images, videos and URLs!

Belgium – currently holding the unenviable role of trying to broker a compromise between European Union (EU) Member States – claimed earlier this year that they had found a “more proportionate” way forward on this controversial law. Governments of countries including Germany, Austria, France, Poland, the Netherlands, Italy, Estonia and Finland have reportedly raised concerns throughout the process. However, they are now under even more extreme pressure to reach a Council deal on the CSA Regulation, even if they have to give up on digital rights and technical reality in the process.

Now, a proposed Council text dated 28 May (9093/24) reveals a starkly different reality from Belgium’s promise of a human rights-respecting CSA Regulation. This new draft carries over the many issues we’ve recently warned about – such as a shocking misunderstanding of the statistics of wrongly flagged images, a failure to respect vital end-to-end encryption, and a risk categorisation methodology that punishes secure and privacy-respecting services whilst rewarding those that systematically violate people’s privacy. It also adds two major new problems:
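The statistical problem mentioned above is known as the base-rate fallacy: when the material being searched for is rare, even a seemingly accurate scanner produces overwhelmingly false flags. The following sketch illustrates this with purely hypothetical numbers – the accuracy figures, prevalence and message volumes below are illustrative assumptions, not figures from the proposal or from any real scanning system:

```python
# Illustration of the base-rate fallacy in mass scanning.
# All numbers are hypothetical assumptions chosen for the example.

true_positive_rate = 0.99    # assumed: scanner catches 99% of actual CSAM
false_positive_rate = 0.001  # assumed: scanner wrongly flags 0.1% of innocent images
prevalence = 1e-6            # assumed: 1 in a million shared images is CSAM

# Bayes' theorem: probability an image really is CSAM, given that it was flagged
p_flagged = (true_positive_rate * prevalence
             + false_positive_rate * (1 - prevalence))
precision = (true_positive_rate * prevalence) / p_flagged

print(f"Share of flags that are correct: {precision:.2%}")

# At scale: suppose 5 billion images are shared per day across the EU (assumed)
images_per_day = 5_000_000_000
false_flags = images_per_day * (1 - prevalence) * false_positive_rate
print(f"Innocent images wrongly flagged per day: {false_flags:,.0f}")
```

Under these assumptions, roughly 0.1% of flagged images would actually be CSAM, while millions of innocent images would be wrongly reported every day – which is the core of the statistical objection.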

1. Forced consent

The premise underpinning this new Belgian approach seems to be that if people consent to scanning, then it isn’t mass surveillance. This is a severe misunderstanding not only of mass surveillance, but also of consent. Consent to data processing, under EU law, must be freely given. It cannot be extracted under the threat that, without it, you will no longer be able to share pictures with your family, send links to your colleagues, and much more. Big Tech platforms have for years coerced users into accepting unlawful data processing practices as a condition of using the service. With this latest text, EU governments would be copying Big Tech’s abusive practices and dark patterns.

CONSENT to data processing, under EU law, needs to be given FREELY!

There is the additional problem that such a mechanism is unlikely to be effective. Those looking to distribute child sexual abuse material (CSAM) will simply refuse consent to be scanned, and will instead move to another service when a detection order forces the provider to restrict sharing functionality for non-consenting users. People who remain on the service, however, will still be caught in the dragnet of flawed AI surveillance – from teenagers whose consensual sexual imagery will be flagged and reported, to those whose content is mistakenly flagged as CSAM.

2. A rose by any other name…

This new Belgian approach has been coined “upload moderation”. However, despite a nicer-sounding name, what’s on the table would still be the same thing: the mass scanning of the private communications of people who are not suspected of any crime, even in end-to-end encrypted environments. Technology and cybersecurity experts have repeatedly warned that this cannot be done safely and securely – putting at risk the private communications of activists, journalists, young people, businesses and even governments!

Client-side scanning will UNDERMINE private communications and leave people VULNERABLE to malicious attacks.

Worse still, the Belgian proposal claims that end-to-end encryption is not compromised if the “upload moderation” (scanning) is done before the message is encrypted. However, this essentially demands the use of client-side scanning – a form of spyware on end-user devices. Cybersecurity experts have pointed out that client-side scanning will undermine the security of private communications and leave people vulnerable to a number of attacks by malicious actors:

“CSS by its nature creates serious security and privacy risks for all society while the assistance it can provide for law enforcement is at best problematic. There are multiple ways in which client-side scanning can fail, can be evaded, and can be abused.” – Bugs in Our Pockets, Abelson et al (2024)

Given the risk categorisation approach proposed by the Belgian government, which shockingly posits more secure and more privacy-respecting services as more dangerous, most end-to-end encrypted communication services are likely to be hit with a detection order. This will critically undermine the security of the service – which would effectively be the end of secure end-to-end encrypted communications in Europe.

There’s a reason why we’ve always said the EU needs to take a different approach. It’s because there is no way to mass scan and report on people’s private communications without violating the core of their right to privacy and creating massive security vulnerabilities. The search for a magic technical solution is doomed to fail because there is no magic solution to the serious problem of child sexual abuse – which instead requires multi-faceted societal-level and political solutions.

There is NO magic solution to the serious problem of child sexual abuse. We need multi-faceted societal-level and political solutions!

If EU Member States take the step of agreeing to Belgium’s new approach, they will effectively ban the sharing of pictures, videos and links in the EU by only “allowing” this type of content in private communications under mass surveillance. It might sound like an episode of Black Mirror – but unless governments stand up for our rights and scientific evidence, it could be coming to all our devices soon.
