It’s time for a heart-to-heart about the EU’s surveillance agenda
The EU prides itself on its worldwide norm-setting influence in the fields of data protection and artificial intelligence regulation. Still, it is not always for the best when it comes to digital state surveillance. Privacy is safety. As we approach the European elections in June, it’s time to discuss the EU's role in shaping how technologies are developed and used.
In March, highly sensitive military information was leaked on Russian TV. The German Air Force had used poorly secured communication software, which resulted in Russia’s interception of their top-secret conversations.
This was the German Taurus leak — a real-life drama that underscored the importance of encrypted, secure communications for everyone’s safety, including governments.
But here’s the kicker: while we’re still reeling from security fiascoes like this one or the recent Pegasus spyware scandal, police chiefs in the European Union are pushing a policing agenda that puts us all in the cross-hairs.
Over the past five years, we’ve seen a concerning trend in the EU’s digital policy playground. Some EU bodies have been pushing for tech solutions that enable some very worrying policing practices, like monitoring all private communications or criminalising the use of encryption, as part of a wider objective of increasing prosecution and imprisonment to silence activists and NGOs.
For example, on 21 April, 32 European police chiefs issued a statement under the aegis of Europol (the EU’s police agency), calling upon the technology industry to stop rolling out end-to-end encryption and to build backdoors in their systems so that companies and law enforcement can gain access to data and monitor communications.
This technosolutionist trend is like a bad dance partner, stepping on the toes of people’s fundamental rights. But these are not the only dodgy dance moves we have seen.
The EU has implemented this tech-powered security agenda in ways that limit civil society participation, favour the surveillance industry and jeopardise digital safety.
DG HOME’s very selective listening
One of the main characters in this story is the European Commission’s Directorate-General for Migration and Home Affairs, or DG HOME.
DG HOME has a worrying track record of pushing an agenda that suggests the only way to achieve security is through surveillance and control. In its quest to tackle the big bad wolves of cybercrime, cross-border crime, and terrorism, DG HOME has thrown caution to the wind. Transparency, accountability, and democratic participation? Sacrificed for a facade of security. The proof is in the pudding, as its handling of the Child Sexual Abuse Regulation (CSAR) shows.
Our hands-on experience on the chat control proposal has revealed evidence of how DG HOME has wrongly framed tech policing as the ultimate solution to complex societal issues such as children’s safety.
In attempting to legalise mass surveillance and expand policing powers in the EU, DG HOME trampled on many of the EU's core democratic standards. As the directorate-general tasked with the CSAR proposal, DG HOME was responsible for conducting a transparent and inclusive consultation process to ensure all stakeholders' views and concerns were heard.
Evidence shows that DG HOME prioritised meetings with Big Tech companies instead, notably Google, Twitter, Microsoft, Apple, Meta/WhatsApp, TikTok, and Snap, as well as surveillance tech industry actors like Thorn, a US company specialising in AI tools for detecting online child sexual abuse imagery. This close collaboration continued for months after the CSAR proposal's publication. For example, DG HOME repeatedly facilitated Thorn's access to crucial decision-making venues attended by ministers of EU member states.
Ill-suited proposals and a lack of real solutions
While making space for the industry, not once in this period did DG HOME respond to calls from digital rights organisations based just metres from its offices in Brussels, which asked to explore social and human interventions as part of a holistic rather than tech-centric approach to the issue of child sexual abuse online.
DG HOME and the European Commissioner in charge of home affairs and leading on the chat control proposal, Ylva Johansson, not only refused to meet with data protection organisations but also openly misled the public about having consulted these groups in an attempt to legitimise their actions.
This is alarming, as DG HOME’s actions have skewed the debate around the CSAR into a one-sided discussion, sidelining organisations critical of the proposal. By excluding dissent from the political space, DG HOME disregards essential human rights like privacy. It fails to account for the complex societal nature of child sexual abuse, resulting in an ill-suited and likely unlawful proposal that does not offer a real solution.
An accompanying journalistic investigation revealed that Europol sought unrestricted access to data from the CSAR’s mass scanning system, with no objections or privacy concerns raised by DG HOME.
The EU Ombudsman is now investigating this clear conflict of interest, in which former Europol officials lobbied for Thorn, and the risk it poses to civil rights.
Why is the door repeatedly slammed in our faces?
DG HOME’s choice to exclude civil society voices advocating for sustainable measures over blanket surveillance of children and adults alike and to prioritise the interests of the surveillance tech industry and law enforcement is a telling indicator of the directorate’s policing-driven agenda.
Considering that digitalisation and tech innovation are high on the priority list for the June 2024 EU elections agenda, policymakers must engage in a transparent and democratically run debate on what security means, for whom, and how it can be achieved.
We need stricter rules on corporate lobbying, particularly for groups like Thorn, which abuses the NGO arm of its organisation to obfuscate its for-profit work. We also need a fair and transparent participation process in the legislative cycle on digital policy that guarantees civil society groups a formal seat at the table.
EU countries’ governments must also ensure that the next European Commissioner for Home Affairs has an understanding of human rights and the rule of law. When proposing technological solutions, the European Commission must ensure that the lead staff has expertise in data protection, privacy, technology and internet regulation.
The EU must end its shadowy games with the industry, engage in meaningful transparency and respect our fundamental rights.