Denmark wants to break the Council deadlock on the CSA Regulation, but are they genuinely trying?
Denmark made the widely-criticised CSA Regulation a priority on the very first day of their Council Presidency, but have shown little willingness to actually find a compromise that would break the three-year-long deadlock on this law. The Danish text recycles previous failed attempts and does nothing to assuage the valid concerns about mass surveillance and encryption. Not only is Denmark unlikely to broker a deal, it also stands in the way of EU countries finding an alternative, meaningful, rights-respecting solution for tackling CSA online.
The recycling of (multiple) failures: the Danish winning recipe?
The EU Member State that holds the Council Presidency sets the institution’s priorities for six months, and proposes compromises between diverging views. Denmark took over the rotating Council Presidency on 1 July 2025. On the very first day, they proposed a compromise text for the Council’s position on the widely-criticised, long-negotiated draft EU CSA Regulation, showcasing this law as one of their flagship priorities – a bold and unexpected move.
Yet, so far, Denmark has displayed little willingness to genuinely find a compromise that would break the deadlock that has immobilised the file for the past three years. Instead of building on the much-needed bridges the Polish Presidency had started to construct, Denmark simply reverted to a mix of past versions of the text, each of which had (thankfully) failed to secure sufficient support due to valid concerns about mass surveillance and the impact on end-to-end encryption.
In reality, Denmark could propose its own compromise text so quickly because (1) it is essentially a remix of previous proposals (mainly the Belgian and the Hungarian texts), each of which had already failed to break the Council deadlock, and (2) it retains nothing from the previous Presidency’s version, which had genuinely tried to bridge the two camps while minimising the damage to the confidentiality of communications and end-to-end encryption. Denmark organised a discussion on their proposal on 11 July 2025 and ambitiously (or perhaps wishfully) projected a deal being reached as early as 14 October 2025.
Overall, it seems the Danish government is not actually trying very hard to make meaningful progress. If they were, they would not have recycled previous iterations of the text that had already proven to lack sufficient political support. Instead, they would have taken steps to assuage the serious concerns regarding mass surveillance and the impact on end-to-end encryption – concerns which would otherwise make the law ripe for annulment at the Court of Justice.
Taking a closer look at Denmark’s proposed compromise text on the CSA Regulation
Denmark’s ‘new’ text contains:
- the possibility to impose mandatory detection orders on parts of services or on whole services, for both ‘known’ (previously identified) and ‘new’ CSAM, as already proposed – and rejected – in the Belgian text (p50 of the Danish compromise proposal). This would amount to mandatory mass scanning of people’s private messages, which is not lawful in the EU. As explained in our position paper, this detection would be automated through AI, despite such tools being unable to take the context of a message into account, and despite the consequent high number of mistakes in their reporting, which would inevitably lead to wrongful accusations, drown genuine cases in false alerts, and criminalise teenagers’ consensual and private sexual self-expression.
- Like in the Belgian text, the detection algorithms would scan visual content (i.e. all photos and videos, but not standalone audio content), and would scan text purportedly only for any URLs it might contain. Scanning text for grooming (which requires textual analysis, and is thus even more intrusive and error-prone than the rest) would initially be out of scope, but a review clause would allow it to be added back into the scope three years after the passing of the law (and every three years thereafter, until it finally – inevitably – is added to the scope) (p150).
- Like in the Hungarian text, Article 10(1) states explicitly that detection orders would apply to end-to-end encrypted services via client-side scanning, which means that software forcibly installed on the device would scan the content of a message before it is encrypted and sent (see our detailed explainer, and the illustrative sketch after this list). Users would be forced to ‘consent’ to this scanning under the service’s general terms and conditions, despite the serious infringement on their privacy and digital security. Non-consenting users would still be able to use the part of the service that does not involve sending visual content and URLs – a meaningless safeguard given that virtually all private communication services include visual and URL-sharing capabilities (p16 and p60). Regardless of the façade of consent, this measure would undermine not only the rationale for using end-to-end encrypted services but also the integrity of all devices, whether or not their users have ‘consented’.
- The Danish proposal reuses the Belgian idea of categorising services under a risk classification of low, medium, and high risk. However, this is not a meaningful safeguard, as every end-to-end encrypted service would qualify as high risk, and would therefore be scanned (cf. footnotes 37-39, p194).
- The Danes retain the Commission’s idea of using age verification, notably as a way to prevent child users from accessing messenger services and downloading apps where they could be approached by adults (p45). Such a crude measure would risk excluding millions of users from much-needed apps and services (not only children but also adults, because not every adult in the EU has a digital ID), disregard children’s autonomy and agency, and create severe privacy and data protection hazards (more information in our position paper on age verification). Service providers would be pressured into systematically verifying the age of their users: if they don’t, they fall into the ‘high risk’ category and therefore become the target of detection orders.
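To make the client-side scanning mechanism described above more concrete, here is a minimal, purely illustrative Python sketch. None of it comes from the Regulation or from any real provider’s code: the function names, the hash-matching approach, and the toy ‘cipher’ are all hypothetical stand-ins, chosen only to show the ordering of steps – the scan runs on the plaintext, on the user’s device, before end-to-end encryption is applied.

```python
# Hypothetical sketch of a client-side scanning pipeline. The point is the
# ordering: content is inspected BEFORE encryption, so end-to-end encryption
# no longer shields it from the scanner. Names and logic are illustrative only.
import hashlib

# Stand-in for a database of 'known' CSAM hashes pushed to every device.
KNOWN_ILLEGAL_HASHES: set[str] = set()

def client_side_scan(plaintext: bytes) -> bool:
    """Runs on the user's device, against the unencrypted content."""
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_ILLEGAL_HASHES

def encrypt_end_to_end(plaintext: bytes) -> bytes:
    """Toy stand-in for real E2EE (NOT secure) - only the step order matters."""
    return bytes(b ^ 0x5A for b in plaintext)

def send_photo(photo: bytes) -> bytes | None:
    if client_side_scan(photo):
        # Flagged plaintext would be reported to a third party here,
        # i.e. it would leave the device without ever being encrypted.
        return None
    # Encryption only protects content that has already passed the scan.
    return encrypt_end_to_end(photo)

ciphertext = send_photo(b"holiday photo bytes")
```

The sketch also illustrates why ‘consent’ via terms and conditions changes nothing technically: the scanner sits inside the sending pipeline itself and reads every plaintext, so the core promise of end-to-end encryption – that only the sender and recipient can access the content – no longer holds.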
Given that many of these elements are precisely the reason behind the long-running opposition of the blocking minority (the 11 Member States stopping the CSA Regulation from passing in the Council), the Danes are evidently heading for yet another failure. Their attempt looks more like a PR exercise than a genuine try at moving things forward – and indeed, the distribution of countries between the pro-Chat Control camp and the blocking minority (discussed in our previous blog) remains unchanged, as per the minutes of the latest meeting on the Danish text.
What is the way forward?
Denmark’s lack of political courage is problematic because it leaves the three-year-long deadlock on the law intact. If EU Member States truly care about protecting children online, it is clear that a different solution – one that is lawful and technically feasible – will be necessary. For example, one option could be for the EU to move forward with the proposal’s less controversial measures, such as establishing an EU Center to act as a knowledge and education hub. Another option could be for the European Commission to follow the advice of France, withdraw this intractable proposal, and replace it with meaningful and realistic measures developed in conjunction with all relevant experts. Technical ‘solutions’ like the ones the Commission’s CSA Regulation proposes cannot solve complex, serious issues like CSA. When deployed proportionately, technology can play an assistive role in tackling this terrible crime – but it can never be a silver bullet.
In that light, we note that the EU recently started moving to address the structural issues that either enable online grooming or prevent online CSA from being properly dealt with. First, the Commission is producing guidelines on measures platforms should take to ensure children’s privacy, safety and security online. These include measures that would add friction to unwanted interactions and make minors’ accounts harder for adult strangers to find and contact. Second, the EU is in the process of updating the 2011 Directive on CSA, which could criminalise, for example, the possession and sharing of AI-generated CSA and CSA deepfakes, as well as the possession and sharing of “perpetrators’ manuals”. The review could also abolish statutes of limitations (the time after which the possibility to prosecute a crime expires) for CSA, and could require people to undergo screening before working in contact with children.
