High-Level Group “Going Dark” outcome: A mission failure

On 13 June, the Justice and Home Affairs Council, composed of EU Member States’ ministers of the Interior, will discuss the recommendations of the High-Level Group (HLG) on Access to Data for Effective Law Enforcement (“Going Dark”). This blogpost provides a short analysis of the HLG’s recommendations and a summary of its procedural flaws.

By EDRi · June 13, 2024

On 13 June, the Justice and Home Affairs Council, composed of EU Member States’ ministers of the Interior, will discuss the recommendations of the High-Level Group (HLG) on Access to Data for Effective Law Enforcement (“Going Dark”). The HLG was tasked to map the current challenges faced by law enforcement when seeking access to data and to identify “commonly accepted solutions”.

Based on an assessment of its 42 recommendations, it must be concluded that the HLG failed at its primary mission. Whereas the HLG was expected to deliver “potential solutions (…) in full respect of fundamental rights”, its recommendations are merely a rebranding of old ideas and policies that cybersecurity, data protection and fundamental rights experts have deemed inadequate and dangerous for years. The stakes for people’s rights are high, as this initiative constitutes a renewed lobbying offensive by law enforcement to push mass surveillance and encryption backdoors into the EU’s future digital agenda.

Indeed, its recommendations for future EU policies and legislation aim to guarantee the maximum possible access to data for law enforcement authorities, even where this bluntly contradicts the reality of today’s digital security systems as well as privacy and data protection safeguards. This outcome reflects the deeply flawed political orientation of the whole initiative, which represented law enforcement interests only and failed to meet EU standards for balanced interest representation, transparency and accountability.

If the EU wants to reaffirm its commitment to fundamental rights and democratic values in the upcoming mandate, these recommendations should not be treated as reliable guidance for any future legislative action. This blogpost provides a short analysis of the HLG’s recommendations and a summary of its procedural flaws.

Nothing new under the sun

The HLG seems to have rehashed the exact same conversations that the law enforcement community has been having internally since the 1990s. Only this time, it attempted to introduce new concepts, or reframe existing ones, to dress up its political agenda: giving police forces the most unrestricted access possible to (encrypted) data.

First, the HLG attempts to redefine the well-established concept of “security by design” to include law enforcement access to all data, including through encryption backdoors. The purpose seems to be to counter the current state of public discourse on privacy and security, which the HLG specifically highlights as a factor that may have hindered the development of lawful access legislation. This attempt to reframe the public debate is deeply misleading, especially when the actual effect of the HLG’s policy recommendations would literally be “insecurity by design”.

Indeed, the HLG recommends the implementation of “lawful access by design” in the development of all technologies, “in line with the needs expressed by law enforcement” (recommendations 22, 23 & 24). In other words, all tech producers, of hardware and software alike, would be forced to build vulnerabilities into their systems, designed to be exploited solely by law enforcement when needed. The accompanying requirement that this should “[ensure] at the same time strong security and cybersecurity” remains very vague and, at this point in time, wishful thinking.

Past hacking incidents show time and again that it is only a matter of time before built-in vulnerabilities are exploited by unintended, and possibly malicious, actors such as organised criminals, corrupt employees or foreign intelligence agencies. Backdoors in every tech system would be a very severe encroachment on everyone’s privacy and online security.
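To see concretely why, consider key escrow, the classic form such an access mandate takes. The minimal Python sketch below is purely illustrative, an assumption of ours rather than anything specified in the HLG report: every message is encrypted a second time to a system-wide escrow key, and that key becomes a single point of failure for everyone’s communications.

```python
# Illustrative sketch only: a key-escrow "lawful access by design" scheme.
# Nothing here comes from the HLG report; all names are hypothetical.
# Every message is encrypted twice: once to the recipient, and once to a
# system-wide escrow key mandated into every client.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message = b"private conversation"
ct_for_recipient = recipient_key.public_key().encrypt(message, OAEP)  # delivered as normal
ct_for_escrow = escrow_key.public_key().encrypt(message, OAEP)        # mandated extra copy

# The escrow private key now decrypts traffic from *all* users. Whoever
# obtains it -- a corrupt insider, a foreign intelligence service, a criminal
# who breaches the escrow facility -- reads everything:
assert escrow_key.decrypt(ct_for_escrow, OAEP) == message
```

The “strong security and cybersecurity” caveat does not change this arithmetic: the escrow key must exist, be stored somewhere and be usable at scale, and each of those properties is an attack surface.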

The HLG defers the hardest nut to crack – accessing encrypted data in the clear without breaking encryption – to yet another “research group”, in charge of assessing the “technical feasibility” of lawful access (recommendation 26). In reality, the HLG is making policy recommendations based on the magical thinking that a technological solution to this long-standing conundrum can suddenly be developed, even though attempts have consistently failed for the past 30 years. Yet we know that the latest “technological solutions”, pitched as secure and privacy-preserving, are in fact far more privacy-invasive, enable bulk surveillance and increase the risk of security failures.
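One family of these pitched solutions is client-side scanning. The sketch below is our own minimal illustration, not a design documented by the HLG: the encryption is indeed never “broken”, because the content is matched against a watchlist and disclosed before it is ever encrypted.

```python
# Illustrative sketch of client-side scanning in its simplest form
# (our assumption, not taken from the HLG report).
import hashlib

WATCHLIST = {hashlib.sha256(b"example flagged content").hexdigest()}

def report_to_authority(plaintext: bytes) -> None:
    print("disclosed in the clear:", plaintext)  # stand-in for a reporting channel

def encrypt(plaintext: bytes) -> bytes:
    return plaintext[::-1]  # stand-in for real end-to-end encryption

def send_message(plaintext: bytes) -> bytes:
    # Scanning runs on the device, ahead of encryption, so every single
    # message is inspected -- bulk surveillance at the endpoint, while the
    # encryption itself remains mathematically intact.
    if hashlib.sha256(plaintext).hexdigest() in WATCHLIST:
        report_to_authority(plaintext)
    return encrypt(plaintext)

send_message(b"example flagged content")  # prints the plaintext disclosure
```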

According to documents obtained by an MEP, the Belgian police suggest selling the concept of “frontdoors”, inspired by their latest domestic legislative proposal. Service providers would be required to “switch off” encryption for targets of an investigation and intercept their data – meaning that past data would not be retrieved; only “future” data would be collected. Needless to say, this would have the same practical effects on digital security as any other backdoor. All in all, the HLG’s contribution to this question is practically void.
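For illustration only, the sketch below shows the simplest form such a “frontdoor” could take; the flow and names are our assumptions, as the documents do not specify a technical design. Confidentiality ends up depending on a server-side flag that can be flipped for any user, by anyone who controls or compromises the server.

```python
# Illustrative sketch of a "frontdoor" (our assumption of the simplest
# possible design, not the Belgian proposal's actual specification):
# the provider's server decides, per user, whether encryption stays on.
INTERCEPT_TARGETS = {"alice"}  # populated by the provider on a police order

def session_policy(user: str) -> bool:
    # The client has no way to verify *why* encryption was "switched off".
    return user not in INTERCEPT_TARGETS

def client_send(user: str, plaintext: str) -> str:
    if session_policy(user):
        return f"<ciphertext, {len(plaintext)} chars>"  # stand-in for real E2EE
    return plaintext  # "future" data flows through the provider in the clear

print(client_send("bob", "hello"))    # encrypted as usual
print(client_send("alice", "hello"))  # silently intercepted in the clear
```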

Second, with regard to data retention, the HLG relayed yet again the Member States’ familiar complaint that the requirements of the Court of Justice of the European Union (CJEU) are “difficult” to implement. Some “experts” suggested simply ignoring the limits and safeguards the CJEU has developed for retention by regulating only access to data, while acknowledging that this approach might not comply with the Court’s well-established jurisprudence.

The HLG calls for a new harmonised data retention regime at the EU level, a decade after the CJEU annulled the previous one in 2014. Such a framework would apply to all internet service providers – including OTTs, for which legal, jurisdictional and technical obstacles are likely to render the proposal infeasible – and would force them to hand over decrypted data. The HLG’s magical thinking here is that all OTT providers, with the help of an EU standardisation body, can be tweaked into operating like traditional telecom providers and collecting the same type of metadata about communications.
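To make the mismatch concrete, the sketch below shows a telecom-style retention record of the kind such a regime would presumably mandate. The fields are our assumptions, loosely modelled on the data categories of the annulled 2006 Data Retention Directive, not taken from the HLG recommendations; several of them have no meaningful equivalent for E2EE OTT services, which are deliberately architected not to generate them.

```python
# Illustrative only: a telecom-style retention record. Field names are our
# assumptions, loosely modelled on the categories of the annulled 2006 Data
# Retention Directive; they are not taken from the HLG recommendations.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RetentionRecord:
    source_id: str        # identifier of the sending/calling party
    destination_id: str   # identifier of the receiving/called party
    start_time: datetime  # date and time of the communication
    duration_s: int       # duration, for calls
    service_type: str     # e.g. "voice", "message", "internet access"
    source_ip: str        # IP address allocated to the user
    equipment_id: str     # e.g. IMEI; often meaningless for OTT apps
    location: str         # cell ID for mobile telephony; absent for many OTTs
```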

Putting tech companies and service providers in charge of impossible solutions

The HLG places a strong emphasis on the role of the private sector in achieving law enforcement objectives:

  • Service providers are expected to be transparent about which data they process, to collect user data they do not need solely for the benefit of law enforcement, and to figure out how to provide data in the clear when receiving a law enforcement interception request, all without compromising the security of their systems. The latter is simply technically impossible once “access by design” has been “standardised” into everything.
  • Very harsh sanctions are foreseen to deter and punish non-compliance, ranging from administrative sanctions and restrictions on operating in the EU market to imprisonment (for offering secure communications services or for complying with domestic laws in third countries). The HLG also attempts to differentiate between providers of electronic communications that are merely “non-cooperative” and the “criminal” ones “designed to offer services solely or mainly to criminal actors”. How this distinction would be made, however, remains vague and undefined, which risks targeting small, community-led and not-for-profit privacy-focused service providers, or driving trustworthy operators out of the EU market.
  • The HLG’s ambitious enforcement model also aims to extend the scope of application to virtually all tech companies, including hardware manufacturers (e.g. of smartphones and storage devices), Over-the-Top (OTT) services (e.g. Telegram, Zoom, iMessage) and the Internet of Things (IoT) industry. While mandatory backdoors at the hardware level would create even more opaque systemic vulnerabilities in Europe’s digital infrastructure, open vulnerabilities in connected cars and smart objects would expose the most intimate aspects of our lives, and our physical security, to cyberattacks.

Promoting ever more privacy-intrusive methods while undermining protections

Another worrying aspect of the HLG’s recommendations relates to the political repercussions of the Europe-wide mass hacking operations against EncroChat and SkyECC. These operations have raised numerous concerns about fundamental rights and rule of law violations. However, the law enforcement community seems eager to harness the momentum of these “police success stories” by demanding that such intrusive and controversial methods be turned into permanent powers.

The HLG therefore demands cheaper and easier access to digital forensic tools such as decryption software, as well as more effective forum-shopping mechanisms (whereby one country carries out investigative operations on behalf of another), such as infrastructure for transferring large volumes of data and shared rules on the admissibility of evidence.

It further advocates extending the e-evidence rules – which EDRi criticised as a dangerous bypass of essential judicial processes and safeguards in the field of judicial cooperation – to live data interception, including from third countries like the US on the basis of international agreements. Live interception is an extremely intrusive investigative technique, and the weak e-evidence framework would be nowhere near sufficient to prevent abuses. Going further, the HLG proposes a revamp of the concept of territorial jurisdiction in order to exclude certain investigative cases from the scope of cross-border cooperation, which would in practice exempt investigative authorities from complying with procedural rules and undermine domestic protections.

Very selective listening and limited transparency

The HLG kept its work sessions closed, strictly controlling which stakeholders were invited and effectively shutting out civil society participation.

When digital rights advocates, led by EDRi, sought to contribute their expertise to the HLG’s sessions and activities, our requests were first rebuffed. We were instead advised to submit written comments, with no guarantee that these would be meaningfully considered. When we finally made it to the round table of a public consultation, as one of the only civil society representatives present, we faced a number of obstacles that limited our interventions. For example, we were given misleading instructions about the topics we could speak on and the timing of our interventions. This masquerade of a consultation ended up being a fruitless back-and-forth between us and the representative of the European Commission (which acts as the HLG Secretariat), with no possibility of genuinely engaging with the HLG members. Meanwhile, we learned from the press that industry players had received invitations to the HLG’s private meetings.

The HLG made no effort to address critical viewpoints. Its report on the public consultation contains only a summary of stakeholders’ written contributions, with no actual response to them. The 42 recommendations make it abundantly clear that all input from civil society and data protection experts was simply ignored, despite the HLG’s claims to the contrary.

Furthermore, it took DG HOME almost three months to respond to EDRi and 20 other NGOs which, in the early phases of the HLG process, had pointed out a worrying lack of transparency and accountability. Following our complaint, DG HOME conveniently changed the group’s rules of procedure, removing some of the original transparency requirements, such as registration in the Register of Commission expert groups and the publication of the group’s composition and meeting minutes. This exemption from standard transparency requirements reinforces the impression that the Commission prefers to keep debates that are crucial for democracy and people’s rights behind closed doors.

As a result, the HLG ran an opaque and unequal participation process, which produced an unbalanced representation of interests. This approach undermined one of the HLG’s own stated objectives: to foster interactive participation among all stakeholders and encourage the sharing of diverse perspectives.

What can be expected next?

The “High-Level” group has failed to present any viable solution to a problem that no one has been able to solve in the last 30 years. Instead, it came up with creative new ways of publicly framing encryption backdoors. Furthermore, it intentionally avoided addressing the criticism, voiced repeatedly not just by civil society but also by cybersecurity experts and institutions, that encryption backdoors create systemic vulnerabilities and undermine our cybersecurity infrastructure.

The HLG is supposed to deliver a longer “concluding report”, providing more detail on its recommendations, in the autumn of 2024. However, we do not expect any groundbreaking ideas from that report either. We will continue to monitor and report on this process’s influence on the Council’s and the Commission’s agenda in the next EU mandate.

Contribution by: Chloé Berthélémy, Senior Policy Advisor, EDRi and Jesper Lund, Chairman, EDRi member IT-Pol Denmark
