Chat Control is in the final stretch – but it could be a marathon, not a sprint

With final negotiations on the controversial CSA Regulation underway, you’d be forgiven for thinking that our digital rights are out of the woods. However, even though the recently agreed position of EU Member States gives grounds for cautious optimism, we are still far from a final deal. Perhaps the most worrying issue that remains is the threat of age verification becoming mandatory across all digital methods of private communication – a hugely disproportionate limitation on our privacy and free speech.

By EDRi · February 25, 2026

A 180-degree turn in Council

After seven EU countries in a row had failed to broker an EU Member State position on the controversial CSA Regulation, in late 2025 the government of Denmark pulled off what had previously seemed near-impossible.

One camp of countries, including Denmark itself, had been fiercely pushing for mandatory scanning of private messages at scale, including in encrypted environments. Another camp, spearheaded by countries including Poland, Czechia and the Netherlands, was boldly pushing in the other direction. These countries rightly urged their peers not to let the important goal of this law be used as justification to implement mass surveillance or undermine the protections offered by end-to-end encryption.

That’s why on 13 November 2025, Denmark’s proposal to eliminate forced detection entirely, and add protections for encrypted communications, came as such a surprise. Could it really be that after so many years of deeply polarised debates, EU government leaders had finally agreed that even when pursuing the most important societal goals (like protecting children), basic fundamental rights, constitutional protections and technical reality still apply? Did the citizen campaign FightChatControl.eu play a role? And how has this been possible even though Germany, previously a data protection champion, reportedly dropped its long-standing opposition to “chat control”?

Whatever the reason(s), this unexpected but welcome agreement in Council means that any final deal on the CSA Regulation should not include provisions that could force services like Signal or WhatsApp to scan all their users’ private messages. Nor should the law force or coerce them into undermining or weakening their encryption. Both of these are key provisions of the Parliament’s position, adopted back in 2023. The Council deal is therefore a really important step for the protection of human rights in the digital context for people across Europe and the world!

What this could mean for the next steps

None of this means the Council’s position is perfect or that the final law will be great – and there’s still a long way to go until we reach that point. The government of Cyprus, currently holding the Presidency of the Council, has planned for the very final negotiations to happen on 29 June 2026. But with discussions already reportedly facing delays, a summer deal seems increasingly at risk.

Moreover, even though the most legally-unsound parts of the Commission’s original proposal are gone from the Council’s position (called a “general approach”), many provisions remain worrying. For example, the Council position doesn’t require national authorities to be independent, and a provision on search engine delisting risks taking attention away from the fact that the best way to tackle illegal content online is to remove it at its source – not hide it.

This development also puts the final negotiations in a strange position. EU laws have to be agreed upon by two institutions: the Council (as representatives of all EU Member State governments) and the Parliament (of 720 democratically-elected officials). As already noted, on some of the most critical parts of the proposed CSA Regulation, the Council and Parliament fortunately seem broadly aligned.

However, as always, the devil is in the detail. The Council no longer wants any mandatory scanning, proposing instead a permanent “voluntary” scheme, a bit like the 2021 interim ePrivacy derogation, which was officially extended in 2024 to much criticism, including from EDRi. Meanwhile the Parliament – long seen as the champion of fundamental rights in this debate – does want mandatory scanning, but only in narrow circumstances, where there is reasonable suspicion of a crime (similar to the idea of a warrant).

This debate was made more complicated when, on 19 December 2025, the Commission gave the Christmas present that no-one wanted: a proposed second extension of the interim ePrivacy derogation (aka “Chat Control 1.0”). Fearing that negotiations on the long-term framework will drag on, the Commission proposed adding an extra two years to the temporary framework, despite a serious lack of proven necessity and proportionality.

Lastly, even though the Parliament and Council are technically the only two EU legislators, the European Commission plays a facilitative role which gives it a lot of influence. This bill has seen a long and scandalous history of inappropriate and at times illegal Commission interference. Although the law is no longer under the control of ex-Commissioner Ylva Johansson, it seems that the Commission is still fighting hard to preserve its original “Chat Control” proposal. This means that the likely outcome of final negotiations (“trilogues”) remains unclear.

One of the biggest contemporary threats to our civil liberties remains

One more issue that EDRi is watching with serious alarm is that of age verification or age assurance – where people’s access to certain services becomes contingent on providing their identity documents, faces or behavioural data. It comes with serious risks to the digital rights and digital security of both adults and children, but the Commission and Council both want to make it mandatory for services deemed risky by the CSA Regulation. The risk framework of the proposal is very problematic, meaning that any secure and privacy-respecting service would automatically be considered risky, and therefore required to implement a mass surveillance infrastructure of identity checks for its users.

This means the future CSA Regulation would require people to submit their sensitive face data or their government-issued IDs in order to send private messages (like on Signal or WhatsApp), send emails or download apps on their phone. Otherwise, they could be locked out of private digital communications entirely.

This is an enormous threat to our basic right to a private life, our free expression and our dignity, because we will no longer be able to communicate digitally without government or corporate oversight. The risk of a chilling effect on adults and children seeking to communicate or seek information legitimately is enormous; whistle-blowing about corruption could disappear entirely; and professional secrecy for doctors, therapists or lawyers could be undermined.

Think about it. Would you feel comfortable engaging in activism, speaking to a journalist, chatting with your friends or communicating with your partner if you knew your chats were being linked to your face or your government ID?

People without a digital ID, or without the ‘right’ kind of digital ID (which could include young people, elderly people, undocumented people and those from backgrounds facing high levels of structural exclusion), would be de facto banned from communicating privately online. Systems that use facial analysis are also notoriously inaccurate for racialised people as well as people with facial differences – meaning disproportionate barriers for these communities, too.

While better protecting both adults and children alike in the online world is an important policy goal, age verification is a false solution, especially when it comes to private messages.

From a legal clarity and legal certainty perspective, the CSA Regulation is also a deeply flawed place for such requirements. The CSA Regulation is supposed to complement the EU’s platform regulation framework, especially the Digital Services Act (DSA). The DSA’s Article 28 already permits the use of age verification for social media and similar platforms, where it is shown to be necessary and proportionate in individual cases.

However, age-gating everyone’s chats, locking high numbers of people out of private online spaces, and having governments or corporations looking over our shoulders when we communicate digitally is not the answer.

While not perfect, the European Parliament’s position on age verification is by far the least harmful: if providers in scope of the law use age verification, then the tool must meet several strong privacy and security safeguards. If the final CSA Regulation goes beyond this threshold, it will be essential for the digital rights community to come together to prevent what might otherwise be one of the biggest contemporary threats to our digital civil liberties.

Ella Jakubowska (She/Her)

Head of Policy