Internal market MEPs wrestle with how to fix Commission’s CSAR proposal

The European Union’s proposed CSA Regulation (Regulation laying down rules to prevent and combat child sexual abuse) is one of the most controversial and misguided European internet laws that we at EDRi have seen. Whilst aiming to protect children, this proposed law from the Commissioner for Home Affairs, Ylva Johansson, would obliterate privacy, security and free expression for everyone online.

By EDRi & IT-Pol (guest author) · April 13, 2023

“With this proposal young people and activists will be stripped of the opportunity to find a safe and encrypted space to exchange experiences and to discuss personal and political matters. We don't want to fear that our supposed-to-be private conversations can be turned against us.”

Aura, young activist, Germany

The EDRi-led Stop Scanning Me movement has been calling on the European Commission to suggest alternative, structural approaches to tackling the root of the horrific crime of child sexual abuse, instead of implementing sweeping surveillance measures.

With the legislative process well underway, we now see Members of the European Parliament (MEPs) wrestle with a proposal that – despite the importance of its goal – is poorly written, proposes measures that are ill-suited to tackle the problem at hand, and fails to protect the crucial human right to respect for our private lives online, just as we have offline.

IMCO MEPs take strong steps to protect fundamental rights

In February 2023, lead MEP for the European Parliament’s internal market and consumer protection (IMCO) committee, Alex Saliba (Malta, S&D group), put forward his draft opinion on the CSA Regulation. While he stopped short of calling to withdraw the entire legislation (as EDRi and 130 other NGOs call for), MEP Saliba made positive and constructive steps to mitigate the severe harms of the Commission’s draft proposal, as well as to prevent it from undermining the Digital Services Act (DSA) – which already regulates illegal content online.

Now, other MEPs in the IMCO committee from all political groups have been given the opportunity to weigh in, putting forward (‘tabling’) the amendments that they want to be considered as part of the negotiation process. Where applicable, specific amendment numbers are referenced in square brackets e.g. [123].

Protecting privacy and end-to-end encryption

Fortunately, a key concern for a majority of MEPs who tabled amendments is protecting encryption. Complementing Saliba’s draft, MEPs Hahn, Körner, Kolaja, Konečná, Lacapelle and Bielan (collectively representing 6 of the 7 European political groups) made it clear that they do not want to see encryption undermined.

Furthermore, across the political spectrum, all the MEPs who put forward amendments called to limit the proposed measures in order to safeguard the fundamental rights of all internet users. Many of them expressly recognised the intrusiveness of the proposed measures and the importance of protecting the right to privacy and confidentiality of communications. MEPs Hahn and Körner, in particular, are unequivocal that the human rights tool of encryption must be fully protected, and that the regulation cannot do anything that would amount to general monitoring [278].

In the case of MEP Konečná from The Left, the need to safeguard fundamental rights extended to rightly calling on the Commission to return to the drawing board with a better attempt [158], a demand she spells out in detail [184, 186]. In general, our assessment is that the MEPs gave an important boost to the direction of Saliba’s report, which tries to better align the CSAR with EU human rights rules.

The MEPs were also unanimous in calling for more provisions to empower users and give them control over their online lives, to ensure that people can report issues easily and in a child-friendly way. As feminist technology researchers have pointed out, the Commission’s proposal risks being paternalistic and disempowering, and should be replaced with measures that put all of us as internet users in control of our digital lives.

Age verification: a problematic measure

Many of the MEPs also made amendments to prevent age verification measures from being made mandatory, whereas others tried – although not always successfully – to attach safeguards to age verification. We think that these amendments will allow Saliba to raise awareness of the fact that age verification is not always a positive measure, and that when pursued, it must be done in a carefully considered, controlled way in order to avoid harming the very young people this regulation is supposed to protect. In EDRi’s assessment, all currently available technologies for age verification have serious problems with data protection, accuracy and discrimination, as well as clear risks of digital exclusion for young people online.

Detection orders

One key point of divergence among the MEPs is their different interpretations of what “targeted” means. This is one of the existential questions of this Regulation. Critics like EDRi and the European Data Protection Supervisor have pointed out that the law takes an overly broad approach, and it is unacceptable that the law would restrict the rights and freedoms of innocent internet users just as much as suspected criminals.

The majority of MEPs who tabled amendments also take a pro-fundamental rights stance on this question, including The Left’s MEP Konečná [207] and Green MEP Kolaja, who tabled over 50 amendments to overhaul the disproportionate ‘detection orders’, replacing them with limited and targeted ‘investigation orders’.

These orders would be limited to cases where there is genuine suspicion, instead of treating huge swathes of the population as suspects without due cause. The draft opinion from MEP Saliba also limits detection orders to known child sexual abuse material (CSAM), avoiding detection orders based on highly unreliable AI technologies for unknown material and grooming. MEP Kolaja’s amendments would help where MEP Saliba’s opinion falls short: ensuring that detection orders are targeted on the basis of prior individual suspicion.

Boldest of all were the Renew group’s MEPs Hahn and Körner, who recognised that in the context of this regulation, the “targeting” proposed is not acceptable, because by nature it cannot be genuinely targeted, and technical tools seeking to do so are deeply fallible. They called to delete detection orders entirely.

Their group’s lead MEP, Catharina Rinzema, however, made an unsuccessful attempt at a middle ground, supported by several of her colleagues in the Renew group. Under her amendments, detection orders would be a measure of last resort, applied only where voluntary mitigation measures have proven unsuccessful [399]. The problem with this approach is that it incentivises voluntary detection measures by service providers – even measures whose level of intrusion may come close to that of detection orders – without adequate safeguards provided for by law.

The issue of metadata

Most troubling of the lot are several of the amendments put forward by MEPs Walsmann and Štefanec (EPP group). For example, the pair propose to extend the regime by introducing a legal basis for scanning metadata [334, 360]. Metadata is information about messages (who you message, how often, and other identifying data), and it is treated as highly sensitive under EU law.

MEPs Walsmann and Štefanec also propose that encrypted services should be forced to scan all their users’ metadata [387]. This would mandate pro-privacy and pro-data protection services, like Signal and Proton, to collect data on their users where they currently do not. This is a dangerous idea because it would force these providers to collect huge amounts of sensitive data on every single person that relies on their services – journalists, human rights defenders, lawyers, politicians, people seeking healthcare, people living under authoritarian regimes.

This would also undermine the EU’s push in other pieces of legislation to protect people from data-hungry platforms that try to hoover up data about their users for commercial or even surveillance purposes. What’s more, this general and indiscriminate processing of metadata from electronic communications would conflict with rulings of the Court of Justice on data retention.

What happens next?

All in all, these amendments highlight that MEPs are wrestling with this complex legislation, but that there is a strong appetite for improvement. Where does this leave us? We believe that Alex Saliba has an almost unanimous basis to push for protecting encryption, as well as a strong basis for his other key reforms: drastically improving detection orders, ensuring that age verification measures are privacy- and data-protecting (and not mandatory), and making sure that the obligations on digital service providers are reasonable in a democratic society. This will feed into the overall position of the European Parliament, which is being led by MEP Javier Zarzalejos (EPP, Spain) in the Civil Liberties (LIBE) committee.

By Ella Jakubowska, Senior Policy Officer, EDRi & Jesper Lund, Chairman of EDRi member IT-Pol