LIBE lead MEP fails to find silver bullet for CSA Regulation
On 19 April 2023, the lead MEP on the proposed CSA Regulation, Javier Zarzalejos (EPP), published his draft report. The report outlines Zarzalejos’ key edits to probably the most controversial EU proposal since the 2006 Data Retention Directive, and provides a basis for other MEPs in the LIBE (civil liberties) committee to put forward their changes by 17 May.
Whilst we agree with MEP Zarzalejos about putting privacy, safety and security by design at the heart, many of his changes may pose a greater risk to human rights online than the European Commission’s original text. Seemingly positive changes – like limiting Detection Orders and safeguarding end-to-end encryption (E2EE) – in fact seem to conceal equally dangerous measures.
That’s because Zarzalejos’ report strongly encourages (or even coerces) “voluntary” mass scanning as a general risk mitigation measure, rather than targeting measures on the basis of reasonable suspicion. As long as such practices remain, the CSA Regulation will be incompatible with the prohibition of general monitoring and with rules on data retention.
Risk assessment and mitigation (Articles 3 and 4)
Zarzalejos’ report focuses on privacy, safety and security by design (AMs 10, 79). This is probably the most feasible approach for the EU to take, and one which is likely to find broad consensus. In a democratic society, we cannot expect – and do not want – sensitive law enforcement duties to be privatised to online service providers.
It is legitimate, therefore, to require providers to build their platforms and services with people’s privacy, safety and security at the heart. This will mean not only giving people effective and accessible ways to report potential CSA and other suspected illegal behaviour (e.g. AM 10), but also ensuring that children and adults alike have safe, trustworthy digital spaces in which they can seek information and express themselves, free from unwarranted snooping.
On the other hand, there is a problem with Zarzalejos’ approach to safety by design: he points to age verification/assurance methods as a universal solution for online safety (AMs 10, 61). In fact, as EDRi warns, these tools can entail the large-scale processing of young people’s most sensitive data by companies, as well as risking digital exclusion. Any age assessment tools should therefore be approached with caution to avoid harmful repercussions. Regrettably, Zarzalejos allows these measures to remain mandatory.
Voluntary detection (Articles 4 and 5a)
Probably the most misguided part of the draft report is the creation of a voluntary scanning regime (AMs 11, 99). The logic here is to allow existing scanning practices to continue with a new requirement of judicial authorisation.
Unfortunately, the system that the draft report creates would likely be the worst of all worlds. Because the report strongly encourages providers to scan content or metadata under Article 4 and the new Article 5a of the CSA Regulation, such “voluntary” measures would become de facto obligations.
That’s because, in order to be seen as cooperative, and to avoid being served with a (real) detection order, providers will logically take the heaviest (and therefore most intrusive) measures that they are permitted to take under Article 4.
Yet Zarzalejos’ proposal does not legally require either the (already insufficient) safeguards proposed for Article 7 detection orders or the (also insufficient) safeguards that currently apply to voluntary detection measures under the interim Regulation.
As such, Zarzalejos has suggested changes which would circumvent almost the entire logic of the CSA Regulation, encouraging (arguably even coercing) the mass scanning of digital communications, with even fewer safeguards for individuals.
Encryption and metadata
Whilst MEP Zarzalejos makes a very positive step in recognising the importance of protecting encryption (AMs 17, 106, 133, 135), there are two problems with how the draft report approaches encryption, meaning that some of the CSAR’s biggest issues are moved around rather than solved.
The first is that the wording is still potentially ambiguous. It is great that the report outlaws the “prohibiting or weakening” of encryption, and requires scanning technology not to “weaken” E2EE or “hamper [its] integrity.” But we would urge the legislators to strengthen the wording to also expressly prohibit detection measures on users’ devices (client-side scanning). This is particularly important because the Commission, for example, argues that client-side scanning does not “prohibit” or “weaken” encryption; in reality, by inspecting content on the device before it is ever encrypted, it renders encryption meaningless.
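To make that argument concrete, here is a deliberately minimal Python sketch of client-side scanning in principle. Everything in it is hypothetical and simplified: real deployments use perceptual hashing rather than plain SHA-256, and the encryption function below is a placeholder, not real cryptography. The point it illustrates is that the content check runs before encryption, so the encryption itself is left formally “unweakened” while the confidentiality it promises is bypassed.

```python
# Hypothetical sketch of client-side scanning; not any vendor's real design.
import hashlib

# Assumed on-device database of flagged material (invented content).
KNOWN_HASHES = {hashlib.sha256(b"<bytes of a known flagged image>").hexdigest()}

def e2ee_encrypt(plaintext: bytes) -> bytes:
    """Placeholder for a real E2EE layer; scanning never touches this step."""
    return plaintext[::-1]  # NOT real cryptography, illustration only

def report_match(digest: str) -> None:
    # Placeholder for a report sent to the provider or authorities.
    print(f"match reported before encryption: {digest}")

def send_message(plaintext: bytes) -> bytes:
    # The check runs on the user's device BEFORE encryption, so the cipher
    # is mathematically intact while confidentiality has already been bypassed.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in KNOWN_HASHES:
        report_match(digest)
    return e2ee_encrypt(plaintext)

send_message(b"an ordinary private message")       # no match, just encrypted
send_message(b"<bytes of a known flagged image>")  # reported, then encrypted
```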
Secondly, the report proposes that instead of scanning the content of E2EE messages, providers of such services would be required to scan metadata (AMs 17, 106). This idea, if implemented in a very careful way, is not without merits. However, Zarzalejos’ amendments still create some very serious problems:
- Metadata can be extremely sensitive, with the European Court of Human Rights (ECtHR) and the Court of Justice of the EU (CJEU) finding that it can be just as sensitive as the content of private communications. The generalised scanning of most forms of metadata would therefore be just as unacceptable as the indiscriminate scanning of content. In Zarzalejos’ draft report, there is no requirement for the scanning of metadata to be targeted, meaning it can be done in a general and indiscriminate way (AM 106);
- Some privacy-focused services are designed to protect their users’ metadata in the same way that they protect content. Such services would be unable to scan metadata without changing the fundamental design of their service, thereby lowering the security and the confidentiality of communications for everyone who relies on them.
So whilst – when done on the basis of reasonable suspicion – the genuinely targeted scanning of metadata might provide a partial solution to one of the challenges posed by the CSA Regulation (the need to protect E2EE communications), it is by no means a silver bullet. In the current wording, Zarzalejos may have simply swapped out one security and surveillance problem for another.
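To make the sensitivity point concrete, here is a minimal Python sketch using entirely invented records. Even with every message body encrypted and unreadable, the pattern of who contacts whom, and when, already supports intimate inferences:

```python
# Invented example: message bodies are E2EE and never readable, yet the
# communication pattern alone is revealing.
from collections import Counter

# (sender, recipient, hour-of-day) records; contents are never seen.
metadata_log = [
    ("alice", "oncology-clinic", 9),
    ("alice", "oncology-clinic", 14),
    ("alice", "addiction-hotline", 23),
    ("alice", "divorce-lawyer", 11),
]

contacts = Counter(recipient for _, recipient, _ in metadata_log)
print(contacts.most_common())
# [('oncology-clinic', 2), ('addiction-hotline', 1), ('divorce-lawyer', 1)]
# A handful of metadata rows already sketch an intimate picture of a person's
# life without decrypting anything, which is why the courts treat metadata as
# comparably sensitive to content.
```

Whether analysis like this is tolerable therefore depends entirely on whether it is targeted at individualised suspicion, which is exactly the limitation the draft report omits.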
Detection Orders (Articles 7-11)
Whilst it seems positive that Zarzalejos has made detection orders a measure of last resort (AM 13), the move is rendered ineffective by the addition of voluntary detection, as discussed above. What’s more, his report defines a “last resort” as simply meaning a failure to implement “voluntary” measures: “when the provider refuses to cooperate by putting in place the mitigating measures aimed to limit the risk” (AM 13).
Privacy- and security-focused providers that cannot or will not scan, because doing so would unjustifiably intrude into the private lives of legitimate users and endanger the people who rely on their services, will therefore be served with a detection order. As such, this regime is neither truly voluntary, nor truly a measure of last resort.
What’s more, even the proposed limitation of detection orders does not go far enough: whilst Zarzalejos does call for them to be targeted and proportionate (AM 3), this is not accompanied by a sufficient limitation of their scope. In the recitals, he explains that orders should be “related to the specific service, users or group of users” (AM 15). The “or” here is very important: it means that an order can still be issued to a whole service, and that targeting to specific users or groups is merely optional.
The main articles similarly lack clear limitations: the issuing authority must “limit the detection order to an identifiable part or component of a service, such as a specific channel of communication or a specific group of users identified with particularity for which the significant risk has been identified” (AMs 128, 129). However, this still does not meet the requirement of specific, individualised suspicion, nor does it prevent orders from being issued in a generalised way. Forthcoming research from the University of Amsterdam will emphasise that limitation to a part or component of a service is not sufficient to be considered ‘targeted.’
In addition, experts in children’s online rights have called for new material to be removed from the scope of detection because of the risk such scanning poses to young people, and police have argued that grooming detection would make the fight against CSA even more difficult. MEP Zarzalejos has nevertheless left both forms of content scanning in scope.
What is the broader context?
Zarzalejos’ draft report comes at a tumultuous time: whilst the complementary impact assessment produced for the LIBE committee by independent experts concluded that the proposal creates big risks of mass surveillance and is likely incompatible with EU human rights rules, Zarzalejos has been vocal in dismissing these concerns. In a recent interview with POLITICO, he stated that “there is no general monitoring obligation in this regulation, and certainly not in my report.” The Commission incidentally makes the same claim about its proposal, with Commissioner Johansson telling POLITICO that “my proposal is very proportionate.”
These claims are deeply worrying. Contrary to the assessments of the Parliament’s own independent experts, the EDPS and EDPB, the Commission’s Regulatory Scrutiny Board (RSB), EDRi, reportedly the legal service of the Council of the EU, and countless others, Zarzalejos fails to address some of the proposal’s most problematic parts.
What does the Court say?
The CJEU has been very clear on issues of mass surveillance, holding in particular that:
- The generalised interception of people’s private communications (i.e. interception not based on specific, individual suspicion) violates the essence of the fundamental rights to privacy and data protection, and so can never be justified;
- The generalised retention of communications metadata can only be justified for national security purposes, or in very specific cases for less sensitive metadata, notably the source IP address. Child sexual abuse is a very serious crime, but it does not meet the national security threshold.
Unless MEP Zarzalejos can further limit the scope of detection orders to ensure that they will not disproportionately impact innocent users, it is likely that the detection orders in the CSA Regulation would be annulled by the Court of Justice. This would take the EU right back to square one on this issue, leaving everyone unhappy.
Further reading
- EDRi’s blog on the amendments put forward by MEPs in the Internal Market and Consumer Protection (IMCO) committee
- Committee on Civil Liberties, Justice and Home Affairs draft report by Javier Zarzalejos
- The original European Commission proposal
- The LIBE committee’s complementary impact assessment criticising the proposal
- 134 NGOs call on EU to withdraw the proposal