Europol is going to build a massive collection of intimate conversations from European youngsters

This story begins with the creation of a new European agency, but ends with one of the world's largest databases of private chats and images shared by European youngsters—built by the police.

By Bits of Freedom (guest author) · November 16, 2022

Sharing is caring

The European Commission wants to be able to force technology companies to watch their users’ chats. Its aim is admirable: to combat child and adolescent sexual abuse. But the proposal is perhaps the biggest threat to confidential communications on the internet today.

Central to the proposal is the creation of a new authority, the EU Centre. That new authority is to share its office space with Europol, Europe’s law enforcement agency supporting national police forces.

This is necessary, according to lawmakers, for the “improved data exchange possibilities” and to ensure that “staff […] have more career opportunities without the need to change location.” And of course, the law stipulates that the two bodies “provide each other with the fullest possible access to relevant information and information systems […]”.

In practice, therefore, the separation between the two agencies will be no more than a corridor.

Context matters

The EU Centre will be tasked with filtering reports of possible child abuse material. Those reports come from platforms like WhatsApp or Signal. To monitor for potential sexual abuse of minors, those companies could be forced to use artificial intelligence to assess their users’ messages.

However, that technology is notoriously bad at assessing context. This matters because sharing an intimate photo in a conversation between two minors is not always harmful. In fact, many EU countries have decriminalised the fully consensual sharing of intimate images between adolescents, recognising that this can be a normal part of sexual self-development. Computers, however, are poorly equipped to make that distinction, precisely because they struggle to interpret context.

If the computer flags a particular photo, video, or chat as potentially containing sexualised material, the platform will often be unable to determine whether a child is at risk or whether it is a legitimate and lawful exchange. Unless it can be certain there is no abuse, the platform is obliged to report the material to the EU Centre.

These platforms are incentivised to do so: better to report too often than too rarely, to minimise reputational risk.
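To see why that incentive matters, consider the base-rate effect: when genuine abuse makes up only a tiny fraction of what is scanned, even a fairly accurate detector produces mostly false reports. The sketch below is purely illustrative; every number in it (volume, abuse rate, detector accuracy) is an assumption invented for this example, not a figure from the proposal.

```python
# Purely illustrative base-rate arithmetic. Every number below is an
# assumption made up for this sketch, not a figure from the proposal.

scanned_per_day = 1_000_000   # assumed: messages with images scanned per day
abuse_rate = 0.001            # assumed: 0.1% of scanned content is real abuse
sensitivity = 0.95            # assumed: detector catches 95% of real abuse
false_positive_rate = 0.05    # assumed: detector flags 5% of harmless content

true_positives = scanned_per_day * abuse_rate * sensitivity
false_positives = scanned_per_day * (1 - abuse_rate) * false_positive_rate
precision = true_positives / (true_positives + false_positives)

print(f"Reports of real abuse:     {true_positives:>8,.0f}")
print(f"Reports of lawful content: {false_positives:>8,.0f}")
print(f"Share of reports that are real abuse: {precision:.1%}")
# With these assumptions, fewer than 2 in 100 reports concern actual
# abuse; the other 98 are private, lawful exchanges sent to the EU Centre.
```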

Clearly unfounded

The EU Centre must then “expeditiously assess and process reports from platforms […] to determine whether the reports are manifestly unfounded […].” A report is only “manifestly unfounded” if it clearly involves no sexual abuse. This is obvious if the picture depicts two elderly people. Or if the video shows a kitten drinking water from a bowl.

But many of the photos will show bare skin. If, in those cases, it is not immediately clear that only adults are involved, young people could be involved. And then, according to the proposal, there could be abuse. And so, the report is no longer “manifestly unfounded”.

This proposal harms everyone, even the very group it aims to protect.

The only way to determine whether it is actually abuse is to find out the context. But the EU Centre doesn’t know the context. And so just about any report of a photo with nudity and young or age-ambiguous people in it, and any conversation that is not overtly between adults, is potentially abusive and therefore not “manifestly unfounded”.
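Rendered as code, the triage rule described above is strikingly asymmetric: a report is discarded only when it is clearly harmless, and everything ambiguous falls through to forwarding. The sketch below is our hypothetical reading of that logic; the names and structure are ours, not the proposal’s.

```python
from dataclasses import dataclass

# Hypothetical rendering of the triage rule described above; the field
# names and logic are our illustration, not text from the proposal.

@dataclass
class Report:
    clearly_no_abuse: bool  # e.g. two elderly people, or the kitten video

def triage(report: Report) -> str:
    if report.clearly_no_abuse:
        return "discard: manifestly unfounded"
    # Everything ambiguous lands here: bare skin, unknown ages, missing
    # context, including lawful exchanges between consenting adolescents.
    return "forward to the police of the relevant member state"

print(triage(Report(clearly_no_abuse=False)))
# -> forward to the police of the relevant member state
```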

False positives will increase the workload

In that case, the EU Centre forwards the photo, chat, or video to the police of the relevant member state. If the user is Dutch, the report is thus forwarded to the Dutch police. They will have to investigate whether there really is abuse. The police must then look for the context: the story behind the photo or conversation.

They have investigative powers to do so. So, they can knock on the platform’s door, for instance, to examine the user’s conversation history. Then it will become clear whether there is abuse. Or not (as will often be the case).

The Irish police already have this experience: of the more than 4,000 photos forwarded to them through a US children’s rights organisation last year, only 20% of reports were confirmed to involve child abuse. (Perhaps even more worrying: the police retain identifying data on all reports. All of them. Even those that have been determined to have nothing to do with child abuse. Just imagine being registered in such a database.)
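It is worth spelling out the arithmetic behind those Irish figures. The sketch below treats 4,000 as a lower bound, since the source says “more than 4,000”.

```python
# Back-of-the-envelope arithmetic from the Irish figures cited above.
# 4,000 is a lower bound ("more than 4,000 photos" were forwarded).

reports_forwarded = 4_000
confirmed_abuse_share = 0.20  # only 20% confirmed to involve child abuse

confirmed = int(reports_forwarded * confirmed_abuse_share)
unconfirmed = reports_forwarded - confirmed

print(f"Confirmed abuse:     {confirmed:,}")    # 800
print(f"Not confirmed:       {unconfirmed:,}")  # 3,200
# Identifying data was retained for all 4,000+ reports, including the
# at least 3,200 that were never confirmed to involve any abuse.
```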

Police build collection of sensitive and private conversations

Speaking of databases… If, in the EU Centre’s estimation, the report is not “manifestly unfounded”, a copy of that report must also be sent to Europol.

So, the report is then passed across the corridor into the hands of Europe’s aspiring ‘FBI’. Europol thus gets access to a gigantic set of sensitive photos, videos, and chats, involving young people in particular. These are intimate images, and in numerous instances there is nothing wrong with them at all.

Europol gets access to a gigantic set of intimate conversations of European youth

The proposal isn’t clear on what Europol should do with these reports. Presumably it will amount to something like “analyse” and “correlate and enrich with other information”. The proposed legislation also does not specify how long Europol may retain those photos. And yet, many of those intimate photos, videos, and chats fall outside Europol’s mandate. That mandate is, after all, “serious international crime and terrorism”, not sexts between consenting teenagers.

In short, if this proposal is passed, the consequences will be disastrous. If you are a young person exploring your sexuality, or simply in a relationship, you have reason to fear the police. You can no longer trust that your most intimate conversations will stay between you and your partner.

You do not help children and young people by handing Europol the largest database of their intimate conversations.

And for politicians who are now thinking, “Good point, we should add a retention period”: you are missing the point. To protect children, there is no need at all for every photo, video, or chat that, according to the proposal’s misguided criteria, might indicate sexual abuse of young people to be sent to Europol.

And certainly not if it is outside their mandate. After all, what was the purpose again? To protect children and young people, right? We can do that much better by empowering victims, making it easier for users to report abuse, increasing the capacity of police child protection squads, and making sure perpetrators are punished. Not by reading everyone’s confidential communications, or by letting Europol host what could quickly become the world’s largest database of intimate adolescent chats and photos.

Contribution by: Rejo Zenger, Policy Advisor, EDRi member, Bits of Freedom