
Council poised to endorse mass surveillance as official position for CSA Regulation

The Council of EU Member States are close to finalising their position on the controversial CSA Regulation. Yet the latest slew of Council amendments – just like the European Commission’s original – endorse measures which amount to mass surveillance and which would fundamentally undermine end-to-end encryption.

By EDRi · July 21, 2023

Content warning: contains discussions of child sexual abuse and child sexual abuse material

EU Member State governments plan to adopt their official position, called a ‘general approach’, on the Child Sexual Abuse Regulation (CSAR) at the meeting of Justice and Home Affairs ministers on 28 September 2023. The Law Enforcement Working Party (LEWP) – which has been negotiating this text on behalf of Member States for the past year – will hold its last meeting before the summer break on 26 July.

Their discussions will be based on Council text 11518/23, which puts forward the Spanish Presidency’s latest suggested changes to the draft CSAR. Here, we sound the alarm about the following major issues with the Council’s latest text:

  • The proposed compromise on detection orders rubber-stamps the Commission’s mass surveillance plans and severely undermines encryption;
  • Blocking and delisting have been made much easier. This means that the removal of CSAM from the internet (which child protection experts confirm is the most effective measure to protect survivors) is much less likely to be pursued by authorities;
  • Non-independent authorities will now have a key role in administering the CSAR’s measures, and policing or pseudo-policing objectives now take a much larger role (even though the CSAR does not have any legal basis for policing);
  • Attempts to protect children from harmful age verification are cosmetic, and many people could be locked out of online communications;
  • Government, and some business, communications are excluded from the law in a shocking admission that otherwise, the proposed measures would violate the confidentiality of those communications.

These latest changes may still generate opposition from a sizeable minority of Member States. This includes the government of Poland, which has been deeply critical of this law; the government of Austria, which has been barred by its Parliament from approving a CSAR that would undermine encryption or violate human rights; and at least seven other countries that have expressed concerns.

Many Member States want generalised scanning of communications, ignore legal advice

LEWP discussions about detection orders (Articles 7-11) have been intense because several Member States have raised opposition to the general and indiscriminate scanning of everyone’s private communications, in particular of end-to-end encrypted (E2EE) communications services. Encryption experts consider deployment of detection technologies to be technically impossible without critically undermining the security design of the communications service.

As we discussed previously, the Council Legal Service delivered a damning opinion on 26 April 2023, which should have provided ample support for the group of critical Member States.

The opinion warned of “a serious risk that it [the CSA Regulation] would be found to compromise the essence of the rights to privacy and data protection”, would undermine encryption, and would allow “general and indiscriminate access to the content of personal communications” by companies.

- Council lawyers

Ambassadors bow to political pressure

In an effort to settle the political disagreement and to weigh the conflicting legal assessments – the Commission’s Home Affairs department defending its proposal against the Council Legal Service’s critique – discussions about detection orders were escalated to ambassador level. At the COREPER meeting on 31 May 2023, a majority of ambassadors supported the Commission’s approach of general and indiscriminate detection orders. They agreed that no exceptions should be made for unknown CSAM and grooming, where AI-based detection technologies are known to produce a large number of false positives, nor to protect encrypted communications. Even audio communications should remain in scope of the CSAR.

To summarise the current situation: a majority of Member States either prefer the advice of the Commission over that of their own lawyers, or their political appetite for legislation enabling the surveillance of people’s private communications without prior suspicion is simply so strong that they are willing to risk having the legislation overturned in the inevitable legal challenge before the Court of Justice of the European Union.

The Commission is the “lone wolf” on the legality of general and indiscriminate detection orders (invoking the legally untested “the content is the crime” argument to rationalise why they believe detection orders cannot be targeted). The assessment of the Council Legal Service, however, is broadly identical to previous assessments from the European Data Protection Supervisor (EDPS) and the European Data Protection Board (EDPB) as well as the complementary impact assessment requested by the Civil Liberties Committee (LIBE) of the European Parliament.

Against this backdrop, the 16 July Council text on detection orders makes very few amendments to the scope of the Commission’s proposal. Calls (real-time audio communications) are excluded from detection orders, but that is the only limitation; voice notes remain in scope.

An attempt is made to make the conditions for issuing detection orders more precise, but the new wording of “objective indications that the service is or will be used for dissemination…” still allows conjecture about the future use of the service. Similarly, it continues to allow orders to be issued on the basis of “foreseeable [risk]” (Article 7.4.a). This would almost guarantee that detection orders will be used in a general and indiscriminate manner.

One rule for the privileged, another for everyone else

A new recital (recital 12a) has been added to say that non-public communications – whether government or business-related – are excluded. However, the phrasing is vague, and there is no corresponding exclusion in the operative text. This means that this purported protection of “professional secrecy” is unlikely to be meaningful and would not protect, for example, journalists, lawyers or doctors using publicly-available services like Signal or ProtonMail.

According to a footnote, the aim of this is to preserve confidential information. This is effectively an admission that the CSAR will strip people in the EU of the ability to communicate confidentially: only governments and privileged organisations will have the option to escape this.

Protection of encryption abandoned

Whilst a proposal last month from the previous Swedish Presidency to fully protect encryption in the CSAR (Article 1.5) gave us reason for hope, those hopes have been crushed in the latest Spanish text. Whereas the Swedes proposed an article to protect encryption, Spain have offered just a watered-down recital (recital 26). A key criticism from technology experts has been that technologies to scan encrypted messages are not safe, secure, or able to be deployed at scale; the Spanish proposal merely requires “having regard to” the availability of technologies, which will do nothing to stop the use of technologies that undermine the integrity of encryption and introduce dangerous vulnerabilities.

The new text also requires that measures must not “prohibit” or “make impossible” the use of end-to-end encryption, which some lawmakers have seen as an acceptable protection of encryption. In reality, what the proposed detection orders will do is mandate the use of technologies like Client-Side Scanning or so-called ‘secure enclaves’, which will weaken or circumvent encryption – meaning that the proposed ‘protections’ will not prevent this outcome.

In the UK, where similar measures for scanning of encrypted communications have been proposed in the draft Online Safety Bill, WhatsApp and Signal have both said publicly that they will not implement client-side scanning or any other circumvention of encryption. Ciaran Martin, former chief executive of the UK’s National Cyber Security Centre, has publicly questioned the wisdom of adopting legislation that most likely could never be used by UK authorities.

With the Council’s position on encryption, the European Union is set on the same collision course with service providers, including one (Signal) that the European Commission itself presently relies on for its secure communications.

There is no technical way for service providers to implement scanning of encrypted communications that only works in the EU and cannot be exploited (abused) elsewhere. As with any other weakening of encryption, the security of every user in the world would be undermined.

Blocking orders for Internet Service Providers remain technically infeasible

Under Article 16(1), an Internet Service Provider (ISP) can be ordered to block access to unspecified child sexual abuse material. Unlike in the Commission’s proposal, providing the ISP with a list of Uniform Resource Locators (URLs) for known CSAM is entirely optional. Article 16(5) still requires blocking orders to be targeted, but it is highly unclear how this wording can be reconciled with the general blocking provision in Article 16(1). Since almost all web traffic is encrypted, there is no technical way to implement access blocking for specific items of content, such as a list of URLs. The ISP cannot inspect the URL because it is contained in encrypted internet packets between the user’s browser and the web server.
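To illustrate why, here is a minimal Python sketch (purely illustrative; the hostname and path below are hypothetical and not taken from any blocklist). With an ordinary HTTPS connection, an on-path observer such as an ISP can see which domain is being contacted (via the DNS lookup and the TLS Server Name Indication field), but the full URL path is only ever transmitted inside the encrypted channel, so the ISP has nothing it could match against a list of URLs.

```python
# Illustrative sketch only: with HTTPS, the URL path travels inside the
# TLS-encrypted channel, so an on-path observer (e.g. an ISP) cannot read
# or filter it. Hostname and path below are hypothetical examples.
import socket
import ssl

host = "example.org"           # visible to the network (DNS lookup, TLS SNI)
path = "/some-specific-page"   # hypothetical path; crosses the ISP only as ciphertext

context = ssl.create_default_context()
with socket.create_connection((host, 443)) as raw_sock:
    # The TLS handshake reveals the hostname to the network, but everything
    # sent through the wrapped socket below is encrypted end-to-end between
    # the client and the web server; the ISP forwards opaque packets.
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        request = (
            f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n\r\n"
        )
        tls_sock.sendall(request.encode())
        print(tls_sock.recv(256))  # first bytes of the decrypted response
```

In practice, this means an ISP can at most block an entire domain or IP address, a far blunter instrument than the URL-level blocking the Council text appears to envisage.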

The technical impossibility of implementing URL-based blocking orders also applies to the Commission’s proposal, and it should have been considered in the Impact Assessment. However, the Council text makes the problems with blocking orders even worse. Blocking orders can now be issued for CSAM hosted in the EU, without first attempting to have it removed. Additionally, the requirement for judicial authorisation is deleted. Together, these amendments could increase the use of blocking orders and, paradoxically, make the fight against the online proliferation of CSAM less effective, because blocking is incentivised over removal at the source.

The same critique applies to the new provision for delisting orders in Articles 18a-18c, where competent authorities can order search engine operators to remove (“delist”) specific URLs where CSAM can be found. Since issuing delisting orders will require considerably less effort for competent authorities than content removal, there is a clear risk that the former will be preferred over the latter. This is regrettable, as only removal is effective in stopping the dissemination of CSAM. The German hotline eco reports a 100% success rate for the removal of German-hosted CSAM and a 98.5% success rate overall. This is a convincing argument against the broad blocking and delisting powers granted to authorities in the Council’s text.

Member States are generally keen to shift enforcement powers from the Coordinating Authority or judicial authorities to competent authorities (such as the police) in other areas of the CSAR as well. Besides blocking orders, their text allows removal orders and delisting orders to be issued by competent authorities. One of the competent authorities must be appointed as Coordinating Authority, but no additional requirements are placed on it.

This is a radical change from the Commission’s proposal (Article 26) where the Coordinating Authority must be an independent administrative authority with a central role in all enforcement matters.

Age verification provisions lack meaningful protections for children’s data

We have raised concerns that age verification and assessment measures – which require providers to predict or verify people’s ages before they can access a platform or service – pose serious human rights risks. These include the mass collection and profiling of children’s data, the destruction of online anonymity, and the exclusion of those without the right identity documents, such as undocumented young people.

In an attempt at a concession to mitigate these risks, the sensitivity of children’s personal data and the need to prevent discrimination (for example, against those without legal documentation) are mentioned as key concerns in the Council text (recital 16a). But so long as age verification and age assessment remain mandatory in the Council’s text (which they do, in Articles 4.3 and 6.1), it is hard to see how the requirements of processing only what is necessary and of non-discrimination can be met.

Other problems

Elsewhere, the new text introduces other worrying provisions. Vague powers are given to coordinating authorities to issue penalty fines to providers, without making it sufficiently clear what would constitute punishable behaviour (Article 5a). A prior attempt from Sweden to disallow (illegal) general monitoring has, furthermore, been deleted (Article 1.4a). And measures allowing various types of expedited police or pseudo-police investigations (recital 29b, Articles 36 and 38) and increased Europol access (Article 46) are introduced, despite the fact that the CSAR is not supposed to be a policing law and does not have an appropriate legal basis for policing.

The controversial interim derogation, currently in force, is extended for 54 months (Article 88) to create a transitional period. This comes together with a prolonged implementation period of the CSAR. The date of application is extended to 24 months after the entry into force, and for the detection orders in Articles 7-11, there is a further delay of another two years. After a potentially big public fight over its adoption, the CSAR could be almost forgotten in the public mind before the negative consequences, especially the mass surveillance associated with detection orders, take effect.

The Council have all but ignored fundamental rights

This near-final Council text leaves the three biggest problems of the CSA Regulation completely unsolved: age verification that leads to social exclusion, makes online anonymity impossible and entails risky processing of children’s data; general monitoring and mass surveillance of public and private communications; and the undermining of encryption.

It leaves a legislative text that cannot be safely implemented by providers, that lacks evidence of effectiveness, that relies on technologies which do not exist – and that has a very high chance of being annulled by the Court of Justice for violating the most essential core of all our rights to privacy and data protection.

Resources & what you can do

The Council position won’t become official until 28 September 2023 at the earliest. There’s still time to remind our governments that they must protect our rights to privacy, data protection, freedom of expression and the presumption of innocence. For example, you could:

  • Contact your national elected representatives (MPs) to make them aware of this dangerous proposed law
  • Contact your national Home Affairs, Justice, and Digital ministers to tell them you oppose these measures
  • Raise public awareness of what is happening and why it matters
  • Join the Stop Scanning Me campaign