Europol’s reform: A future data black hole in European policing

The European Parliament is soon due to vote on the expansion of the powers of the European Union’s law enforcement agency, Europol. Civil society has been extremely critical of Europol’s mandate revision, raising many concerns about the lack of fundamental rights protections and policymakers’ blind and absolute trust in how the agency will use its new powers. There is all the more reason to worry: the outcome of the trilogue negotiations with the Council of the EU made the text even worse.

By EDRi · April 20, 2022

The final text agreed between the European Parliament and the Council of the EU revising Europol’s mandate confirms the numerous fears of EDRi and other civil society groups. The agreement, which only needs the final approval of the Parliament’s plenary to enter into force, goes further than the Commission’s original proposal in absolving the police agency of its past illegal data practices, limiting the powers of its watchdogs and giving it a blank cheque to use predictive technologies. If adopted, it will transform Europol into a data black hole at the centre of European police cooperation.

Here are a few examples of how the co-legislators watered down the meagre safeguards the Commission had introduced to regulate Europol’s new powers:

Unlimited data collection and use for ‘research and innovation’

With the reform, Europol officially receives a legal mandate to carry out its own research and innovation projects, which includes the development, training, testing and validation of algorithms for building policing tools. The absence of a legal basis had not prevented the agency from already setting up research infrastructure and activities, such as the Europol Innovation Lab, with the support of the Member States’ Ministers of the Interior.

While the Commission’s proposal had restricted the use of personal data for that purpose to data falling into the categories that Europol is legally allowed to process (listed in Annex II of the Europol Regulation), the political agreement between the Council and the Parliament removes this limitation. This will allow the agency to use any type of data it receives, regardless of whether it comes from third countries, private companies or national police authorities, and regardless of whether the personal data concerns people connected to criminal activities within Europol’s mandate.

The legislators justify this leeway by assuming that feeding more data to the algorithms will “prevent bias in algorithmic decision-making” (recital 37). This shows a profound misunderstanding of the harms caused by so-called “debiasing techniques” and of their unsuitability for addressing the discriminatory effects of AI. EDRi’s paper “Beyond Debiasing. Regulating AI and its inequalities” has demonstrated how debiasing approaches oversimplify complex problems of injustice and heighten the risk of violations of privacy and data protection rights. The reform thus legitimises Europol’s collection of an unlimited and uncontrolled amount of data for research projects, some of which will later be put into operation for predictive policing and other inherently discriminatory law enforcement activities.

Secret informal data exchanges with private companies

The new mandate authorises Europol to receive datasets voluntarily shared by private companies, including online platforms like Facebook, banks and transport companies, to process and analyse them, and to share the results with relevant national authorities (including in third countries). Furthermore, Europol is allowed to transmit personal data to the private company and request additional information where the information received is insufficient to determine which national jurisdiction is responsible. The negotiators specified that this can include any type of data: subscriber, traffic and content data. Does this voluntary disclosure of possibly sensitive data meet the legality requirements under EU law? Likely not, and the reform is silent on this point. Will procedural safeguards be guaranteed in this process? Certainly not. Will the individuals affected be informed of such transfers? Definitely not. Tellingly, the reform introduces yet another procedure for requesting additional data from private parties via Member States’ national laws.

Furthermore, Europol’s role in online content moderation is stepped up. The agency already has an in-house “internet referral unit”, responsible for trawling the web for content that is potentially “incompatible” with the terms of service of online service providers, so that those providers can decide whether or not to delete it. Europol will now be able to exchange personal data, notably hashes (to feed the providers’ huge databases and filtering systems), IP addresses and URLs, with these private companies. While the Commission’s original proposal restricted this data exchange to “crisis situations” involving the rapid dissemination of terrorist content, the legislators extended Europol’s referral and information-sharing tasks to cover child sexual abuse material, on a continuous basis. This reinforced “cooperation” with private parties deepens the existing lack of transparency and judicial supervision in Europol’s activities, which could lead to serious fundamental rights violations. Moreover, Europol will, in practice, acquire an operational capacity of its own in policing online content, despite such a role being incompatible with its mandate under the EU Treaties.

Shrinking oversight powers 

We have already pointed out that Europol’s reform significantly expands the agency’s operational capacities without a matching reinforcement of its oversight bodies. The agreement reached during the trilogues certainly did not improve the accountability mechanisms. The European Data Protection Supervisor (EDPS) will only be consulted when “new types of processing operations” are launched by Europol, leaving specific operational activities unchecked. The EDPS cannot suspend the deadline for providing an opinion, even in cases where it needs more resources or information. Europol is even allowed to start processing immediately after the prior consultation has been launched if it considers the processing to be of “substantial significance” for the performance of its tasks – a loophole that can easily be abused.

Europol will be in charge of assessing for itself whether it needs to process data outside of the categories permitted in Annex II to support a criminal investigation: the EDPS is only “informed” of this assessment, with no possibility to contest or amend it, and only after Europol stops supporting the investigation (which can take years).

During the negotiations, the Parliament proposed appointing a Fundamental Rights Officer (FRO) “to support Europol in safeguarding the respect for fundamental rights in all its activities and tasks”. Unfortunately, the FRO will be chosen by Europol’s Management Board from among the agency’s staff. Even with the best “training in fundamental rights law and practice”, this person will not be able to make independent decisions or effectively hold the internal hierarchy to account when violations are identified. Similar conclusions can be drawn about the Joint Parliamentary Scrutiny Group: being invited twice a year to join the Management Board meetings, without voting rights and with access to only limited, consolidated information on Europol’s activities, will not improve the oversight of the agency’s day-to-day work.

As a result, EDRi and its partners call on the European Parliament to reject the reform of Europol’s mandate. It is urgent that MEPs acknowledge the considerable risks of violations of the rights to a fair trial, to privacy and data protection, to freedom of expression and to non-discrimination that Europol’s future powers could lead to. At a time when police action at national level is being called into question, Europol should be subjected to the same level of scrutiny, and not allowed to become a data black hole, shrouded in secrecy, supporting discriminatory policing practices throughout Europe.

Image credit: Max Fleischmann / Unsplash

(Contribution by: Chloé Berthélémy, Policy Advisor, EDRi, and Jesper Lund, Chairman of EDRi member IT-Political Association of Denmark)