Internal market MEPs wrestle with how to fix Commission’s CSAR proposal
The European Union’s proposed CSA Regulation (Regulation laying down rules to prevent and combat child sexual abuse) is one of the most controversial and misguided European internet laws that we at EDRi have seen. Whilst aiming to protect children, this proposed law from the Commissioner for Home Affairs, Ylva Johansson, would obliterate privacy, security and free expression for everyone online.
“With this proposal young people and activists will be stripped of the opportunity to find a safe and encrypted space to exchange experiences and to discuss personal and political matters. We don't want to fear that our supposed-to-be private conversations can be turned against us.”
The EDRi-led Stop Scanning Me movement has been calling on the European Commission to suggest alternative, structural approaches to tackling the root of the horrific crime of child sexual abuse, instead of implementing sweeping surveillance measures.
With the legislative process well underway, we now see Members of the European Parliament (MEPs) wrestle with a proposal that – despite the importance of its goal – is poorly written, proposes measures that are ill-suited to tackle the problem at hand, and fails to protect the crucial human right to respect for our private lives online, just as we have offline.
IMCO MEPs take strong steps to protect fundamental rights
In February 2023, lead MEP for the European Parliament’s internal market and consumer protection (IMCO) committee, Alex Saliba (Malta, S&D group), put forward his draft opinion on the CSA Regulation. While he stopped short of calling to withdraw the entire legislation (as EDRi and 130 other NGOs call for), MEP Saliba made positive and constructive steps to mitigate the severe harms of the Commission’s draft proposal, as well as to prevent it from undermining the Digital Services Act (DSA) – which already regulates illegal content online.
Now, other MEPs in the IMCO committee from all political groups have been given the opportunity to weigh in, putting forward (‘tabling’) the amendments that they want to be considered as part of the negotiation process. Where applicable, specific amendment numbers are referenced in square brackets, e.g. [123].
Protecting privacy and end-to-end encryption
Fortunately, a key concern for a majority of MEPs who tabled amendments is protecting encryption. Complementing Saliba’s draft, MEPs Hahn, Körner, Kolaja, Konečná, Lacapelle and Bielan (collectively representing 6 of the 7 European political groups) made it clear that they do not want to see encryption undermined.
Furthermore, across the political spectrum, all the MEPs who put forward amendments called to limit the proposed measures in order to safeguard the fundamental rights of all internet users. Many of them expressly recognised the intrusiveness of the proposed measures and the importance of protecting the right to privacy and confidentiality of communications. MEPs Hahn and Körner, in particular, are unequivocal that the human rights tool of encryption must be fully protected, and that the regulation cannot do anything that would amount to general monitoring [278].
In the case of MEP Konečná from The Left, the need to safeguard fundamental rights extended to rightfully calling on the Commission to return to the drawing board with a better attempt [158], which she specifically outlines [184, 186]. In general, our assessment is that the MEPs gave an important boost to the direction of Saliba’s report, which tries to better align the CSAR with EU human rights rules.
The MEPs were also unanimous in calling for more provisions to empower users and give them control over their online lives, to ensure that people can report issues easily and in a child-friendly way. As feminist technology researchers have pointed out, the Commission’s proposal risks being paternalistic and disempowering, and should be replaced with measures that put all of us as internet users in control of our digital lives.
Age verification: a problematic measure
Many of the MEPs also made amendments to prevent age verification measures from being made mandatory, whereas others tried – although not always successfully – to add safeguards to age verification. We think that these amendments will allow Saliba to raise awareness of the fact that age verification is not always a positive measure, and that when it is pursued, it must be done in a carefully-considered, controlled way in order to avoid harming the very young people this regulation is supposed to protect. In EDRi’s assessment, all currently available technologies for age verification have serious problems with data protection, accuracy and discrimination, as well as clear risks of digital exclusion for young people online.
Detection orders
One key point of divergence among the MEPs is their different interpretations of what “targeted” means. This is one of the existential questions of this Regulation. Critics like EDRi and the European Data Protection Supervisor have pointed out that the law takes an overly broad approach, and that it is unacceptable for the law to restrict the rights and freedoms of innocent internet users just as much as those of suspected criminals.
The majority of MEPs who tabled amendments also take a pro-fundamental rights stance on this question, including The Left’s MEP Konečná [207] and Green MEP Kolaja, who tabled over 50 amendments to overhaul the disproportionate ‘detection orders’, replacing them with limited and targeted ‘investigation orders’.
These orders would be limited to cases where there is genuine suspicion, instead of treating huge swathes of the population as suspects without due cause. The draft opinion from MEP Saliba also limits detection orders to known child sexual abuse material (CSAM). This avoids detection orders based on highly unreliable AI technologies for unknown material and grooming. MEP Kolaja’s amendments would seem to help where MEP Saliba’s opinion falls short, by ensuring that detection orders are targeted on the basis of prior individual suspicion.
Boldest of all were the Renew group’s MEPs Hahn and Körner, who recognised that in the context of this regulation, the “targeting” proposed is not acceptable, because by nature it cannot be genuinely targeted, and technical tools seeking to do so are deeply fallible. They called to delete detection orders entirely.
Their group’s lead MEP, Catharina Rinzema, however, made an unsuccessful attempt at a middle ground, supported by several of her colleagues in the Renew group. In her amendments, detection orders would become a measure of last resort, to be used only where voluntary mitigation measures have proven unsuccessful [399]. The problem with this approach is that it incentivises voluntary detection measures by service providers, even measures whose level of intrusion may come close to that of detection orders, without adequate safeguards provided for by law.
The issue of metadata
Most troubling of the lot are several of the amendments put forward by MEPs Walsmann and Štefanec (EPP group). For example, the pair propose to extend the regime by introducing a legal basis for scanning metadata [334, 360]. Metadata means information about messages (who you message, how often, and other identifying data) and is treated as highly sensitive under EU law.
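To illustrate why this is so sensitive, the sketch below shows the kind of record about a single message that a provider could be made to keep and analyse. It is a purely hypothetical illustration on our part: the field names and values are not taken from the amendments or from any real provider.

```python
# Hypothetical example of communications metadata: no message content at all,
# yet it still shows who talks to whom, when, how often and from where.
message_metadata = {
    "sender": "user_4821",                # who is messaging
    "recipient": "user_9377",             # whom they contact
    "timestamp": "2023-04-14T22:17:03Z",  # when the message was sent
    "message_size_bytes": 4096,           # hints at the type of content
    "client_ip": "192.0.2.45",            # where from (approximate location)
    "device_id": "a1b2c3d4",              # which device (can link accounts)
}

# Aggregated over time, records like this reveal social graphs, routines and
# sensitive relationships: a source talking to a journalist, repeated contact
# with a clinic, membership of a political group, and so on.
```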
MEPs Walsmann and Štefanec also propose that encrypted services should be forced to scan all their users’ metadata [387]. This would mandate pro-privacy and pro-data protection services, like Signal and Proton, to collect data on their users where they currently do not. This is a dangerous idea because it would force these providers to collect huge amounts of sensitive data on every single person that relies on their services – journalists, human rights defenders, lawyers, politicians, people seeking healthcare, people living under authoritarian regimes.
This would also undermine the EU’s push in other pieces of legislation to protect people from data-hungry platforms that try to hoover up data about their users for commercial or even surveillance purposes. What’s more, this general and indiscriminate processing of electronic communications metadata would be in conflict with rulings of the Court of Justice on data retention.
What happens next?
All in all, these amendments highlight that MEPs are wrestling with this complex legislation, but that there is a strong appetite for improvement. Where does this leave us? We believe that Alex Saliba has an almost unanimous basis to push for protecting encryption, as well as a strong basis for his other key reforms: drastically improving Detection Orders, ensuring that age verification measures are privacy- and data-protecting (and not mandatory), and making sure that the obligations on digital service providers are reasonable in a democratic society. This will feed into the overall position of the European Parliament, which is being led by MEP Javier Zarzalejos (EPP, Spain) in the Civil Liberties (LIBE) committee.
Below, we set out in more detail what the IMCO MEPs think and how they would see the CSA Regulation amended on key topics.
We strongly welcome the amendment from MEPs Hahn and Körner (Renew) to fully protect the human rights tool of encryption [192, 274]. We also welcome the move from MEP Lacapelle (ID) to recognise that the encryption of services cannot be considered as something that makes them risky [309], and that they cannot be required to compromise encryption [347].
Whilst their wording – which is based on MEP Saliba’s draft – could be slightly amended to eliminate any ambiguity, MEP Konečná (The Left), MEP Bielan (ECR) and MEP Kolaja (Greens) also add their names to the strong call for the full protection of encryption [189, 190, 385, 386, 388]. We would also find it important that the protection of encryption in the Regulation includes clear wording that prohibits measures which would undermine end-to-end encryption through, for example, client-side scanning.
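For readers less familiar with the term, client-side scanning means checking a message on the user’s own device before it is encrypted, which is why it circumvents end-to-end encryption even though the encryption itself is never broken on the wire. The sketch below is a deliberately simplified illustration from us: the function names are hypothetical, and real proposals tend to rely on fuzzier perceptual hashing or machine-learning classifiers rather than exact hash matching.

```python
import hashlib

# Hypothetical, externally supplied list of hashes of known prohibited images.
BLOCKLIST_HASHES = {"3f5a9c...", "d41d8c..."}


def report_to_authority(digest: str) -> None:
    """Illustrative stand-in for a mandated reporting channel."""
    print(f"match reported: {digest}")


def scan_then_send(plaintext: bytes, encrypt, transport) -> None:
    """Scan a message on the device *before* it is end-to-end encrypted.

    The ciphertext on the wire stays intact, but every private message has
    already been inspected against an externally supplied list, which is why
    this approach is widely seen as undermining end-to-end encryption.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST_HASHES:
        report_to_authority(digest)
    transport.send(encrypt(plaintext))
```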
MEPs Walsmann and Štefanec (EPP), however, put forward a problematic stance on encryption [387]. They explain that EU Member States cannot prevent providers from offering encrypted services. This misses the point: the CSAR does not stop providers from offering encrypted services, but requires them to make changes which would fundamentally alter and undermine that encryption. What’s more, they say that if providers offer encrypted services, then they must scan metadata [387]. We warn that providers would be stuck between a rock and a hard place: either invading the contents of their users’ messages, or invading their privacy by processing large amounts of sensitive data about their usage, patterns and connections, in both cases without any requirement of prior individual suspicion.
They also make an untrue statement in the recitals that methods for detecting online child sexual abuse in end-to-end encrypted communications already exist [211]. This could pave the way for detection orders being issued to end-to-end encrypted service providers, who would be expected to come up with technologies that are neither safe nor reliable. Such “magical thinking”, as President of the Signal Foundation Meredith Whittaker puts it, vastly overstates what technology is able to do. Yet it has been at the core of repeated misleading claims about technology by Commissioner Johansson and Ashton Kutcher, which here seem to have reached MEPs.
A cluster of Renew MEPs led by MEP Catharina Rinzema propose in a recital that the regulation must be “in full accordance and respect of encryption” [213]. They do, however, want continued research and “innovation”; it must be ensured that this does not become a route to circumventing encryption.
Later, and along with MEP Bielan (ECR), MEP Rinzema et al echo the amendment from MEP Saliba (S&D) which calls for the “protection of encryption […] where applicable” [519, 524]. This wording is ambiguous: it could be read as meaning that encryption must be protected whenever it is used (which we believe is the intention), or that encryption is only protected where such protection is deemed applicable. The ambiguity needs to be corrected to make clear that encryption must always be protected.
We conclude that the majority of MEPs support Saliba’s strong protection of encryption, and that there is a lot of evidence to suggest that this will be upheld in the final IMCO opinion. Where there are divergent amendments, we hope that this can be resolved by helping those MEPs to better understand how encryption technologies work, and why it is so important to protect them.
MEPs Hahn and Körner (Renew) on the one hand, and MEP Kolaja (Greens) on the other, have taken different approaches to tackle the serious risks of Detection Orders (Articles 7-11), mitigating what is arguably the worst part of the entire CSA Regulation.
MEPs Hahn and Körner have gone for the effective strategy of outright deletion [390, 392, 458, 485, 503, 530], whereas Kolaja has transformed the generalised scanning of Detection Orders into a more limited and controlled approach, which he calls “investigation orders” [391 et al]. More than 50 amendments together constitute this significant shift by Kolaja, which seeks to ensure that the CSAR is in line with the right to the presumption of innocence, restricting the privacy only of those who are reasonably suspected of spreading or committing CSA. Kolaja also shows his technical mettle in asking for reports of both false positives and false negatives [644].
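Reporting on false positives matters above all because of scale. As a purely illustrative back-of-the-envelope calculation (both figures below are our own assumptions, not official statistics or vendor claims), even a scanner that is wrong only once in a thousand cases would flag millions of innocent messages every day if applied across EU communications:

```python
# Both numbers are illustrative assumptions, not figures from the proposal.
messages_per_day = 5_000_000_000   # assumed daily in-scope messages in the EU
false_positive_rate = 0.001        # assumed 0.1% false-positive rate

wrongly_flagged = messages_per_day * false_positive_rate
print(f"{wrongly_flagged:,.0f} innocent messages flagged per day")
# -> 5,000,000 innocent messages flagged per day
```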
We believe that Kolaja’s proposal could be further improved with better DPIA requirements [412], by ensuring no risk of misuse of data [446], and by deleting references to solicitation scanning, which is a fundamentally unsuitable technology [494, 508, 514, 633 et al]. We also find Kolaja’s proposal for ‘preservation orders’ interesting, but would recommend strict safeguards to ensure that this is limited and well-controlled [483, 547]. More safeguards would be needed, for example, to protect services that do not collect their users’ data, to ensure that they cannot be forced to do so.
MEP Lacapelle (ID) takes the approach of adding safeguards to Detection Orders. Although we argue that he does not go far enough to address the key harms, his changes are still significantly better than the Commission’s text. Lacapelle requires Data Protection Impact Assessments (DPIAs) to be carried out, as well as prior consultation with data protection authorities [411], and a requirement to consider the fundamental rights of all users [423].
He also calls for the full respect of the confidentiality of communications [509] and calls out the Commission’s violation of the principle of personality [436, 441, 444]. However, we do not understand Lacapelle’s logic in deleting Article 10, which would remove the Commission’s (admittedly inadequate, but better-than-nothing) safeguards [502].
MEP Konečná (The Left) calls to target Detection Orders “to the greatest extent possible”, but could go further to ensure that this is always the case [445], for example through MEP Kolaja’s idea of ‘investigation orders’.
MEP Bielan (ECR) takes several steps to limit Detection Orders, making them a measure of last resort, excluding cloud services, and limiting them to only known material [394]. He also calls for them to be “strictly targeted” [200]. However, MEP Bielan frames limiting the orders to specific users as a mere possibility (rather than making it mandatory). Under EU fundamental rights law, this should be mandatory in order to avoid constituting mass surveillance. MEP Konečná (The Left) adds similar limitations, but without the exception for cloud services [395].
On the topic of mass surveillance, MEPs Walsmann and Štefanec (EPP) and Rinzema et al (Renew) leave the Commission’s Detection Orders largely untouched, other than some small but ultimately meaningless attempts at safeguards [398, 402, 399, 515]. MEP Rinzema et al (Renew) even add a new condition that Detection Orders will apply until providers can prove that they have eliminated risk [452]. This is a very flawed amendment, which fails to grasp that risk can never be fully eliminated, and that asking providers to eliminate it is not realistic.
In addition to Detection Orders, MEPs Walsmann and Štefanec (EPP) and Rinzema et al (Renew) table the idea of voluntary detection as their preferred measure, meaning that providers could scan online content and private messages at their own discretion. As they call for Detection Orders to be a last resort, this voluntary scanning would likely become the main method of detection [292, 316, 337, 346]. This attempt to find some sort of ‘middle ground’ is misguided, because it constitutes the same level of infringement on the fundamental rights of users – without safeguards such as the need for judicial authorisation. It also undermines the key argument that scanning messages is necessary: if it were, the measures would have to be mandatory.
In sum, with the exception of MEPs Walsmann and Štefanec and Rinzema et al, we see strong support to strictly limit Detection Orders, most likely to known material only and for specific users, as well as to reduce their scope and increase their safeguards.
As the amendments on risk assessment (the part of the CSA Regulation which is likely to entail mandatory age verification) are wide-ranging and diverse, we first of all want to emphasise that the best approach to risk assessment is likely to be to give providers set criteria that they must meet, without forcing them to resort to certain technical measures (e.g. age verification).
Furthermore, it should be mandatory for their proposed mitigation plan to be approved in advance (ex ante) by their national Data Protection Authority. Requiring providers to undergo such a DPIA (data protection impact assessment) process would ensure that their measures are tailored, appropriate, necessary and proportionate – rather than the Commission’s one-size-fits-all approach. Whilst no single MEP tabled such a change, we believe that it would be supported by many of the amendments on the table when read together.
We welcome the proposal by MEP Kolaja (Greens) to ensure strict and well-controlled risk mitigation measures with a more realistic approach to risk [320, 341, 343, 345, 365]. MEP Kolaja proposes that the risk mitigation obligation should only apply to online services that are exposed to substantial amounts of child sexual abuse material. This follows the spirit of the Terrorist Content Online Regulation (EU) 2021/784, where specific measures – the equivalent of risk mitigation measures in the CSAR – only apply to services that are exposed to terrorist content. Kolaja’s amendments would save a large number of smaller online services (including, for example, small Mastodon instances run by volunteers) from a considerable administrative burden, because the risk mitigation measures would be targeted towards the online services that genuinely have problems with CSAM.
MEP Kolaja also emphasises that risk mitigation measures must not amount to general monitoring, and must never oblige providers to seek illegal material (in line with the DSA) [355]. Some of the same ideas are also put forward by MEP Bielan (ECR) [321, 344] and MEP Grapini (S&D) [342]. MEP Lacapelle (ID) has also strengthened the risk assessment safeguards by clarifying that nothing in the risk process can amount to accessing private communications nor compromising their encryption [347], showing his strong understanding of the risk that Articles 3 and 4 of the CSAR might – without clarification of the original text – encourage generalised surveillance.
When it comes to age verification specifically, several MEPs (Kolaja, Hahn, Körner) have recognised that it is not a silver bullet solution to issues of online safety, recommending the deletion of text about age verification in the search for solicitation [348, 349], as well as a general move away from mandatory age verification by Kolaja and Konečná [289, 290, 352].
These four MEPs additionally call to remove the unrealistic obligations on app stores in Article 6, which, in the Commission’s text, risk locking young people out of digital communication services [375, 376, 378, 380, 382, 383]. These changes are important because, whilst there may be some cases where age verification is an appropriate measure – as already allowed under the General Data Protection Regulation (GDPR) – making it mandatory for a large number of services, notably all interpersonal communications services, under the CSAR could do more harm than good.
MEP Rinzema et al (Renew) do not remove mandatory age verification, but suggest some safeguards which would be useful if the requirements were not mandatory [315].
We are deeply concerned that MEPs Catharina Rinzema, Morten Løkkegaard, Jordi Cañas and Sandro Gozi (Renew) propose the use of what would amount to ‘upload filters’, which could have a severe chilling and censorship effect on legitimate free expression [292]. The same group of MEPs also propose similar measures to prevent “uploads from the dark web” [294]. Their amendment offers no definition of the ‘dark web’. Besides being unworkable in practice due to the fundamental problem of singling out traffic from the ill-defined ‘dark web’, the amendment is highly problematic because it is oblivious to the fact that access to, for example, social media services from the Tor network is critical for people living in oppressive states where social media, and other services that support freedom of expression, are routinely blocked. Criminalising the ‘dark web’ would therefore have severe repercussions for many legitimate users.
We welcome the proposal from MEPs Hahn and Körner to ensure that there are mandatory and sufficiently-staffed “user notification mechanism[s]” [374], which we also see proposed to some extent from MEPs Walsmann and Štefanec (EPP) [160]. Such a change would ensure that young people who are affected or discover abusive material will always have a way to make reports – something that today is sorely lacking. Other measures to protect children by empowering them (rather than surveilling them) are seen in, for example, the call from MEP Kolaja (Greens) to support and educate young people [329, 331, 332].
On the other hand, MEP Bielan (ECR) takes steps that are presumably intended to protect children, but rather than recognising that children have rights on the internet too, takes a sledgehammer approach. In one amendment [308], Bielan wants messaging services that allow children to communicate privately to be considered inherently risky. This is a disproportionate move, which could lock vulnerable young people out of vital lifelines and support.
In another, he calls for parental oversight and control, without recognising the times where this can be harmful to young people – especially adolescents [327]. A similar error is made by Rinzema et al (Renew) when they refer to parental control measures as “safety by design tools”, failing to recognise the risks of parental control measures for some young people [523]. MEPs Walsmann and Štefanec (EPP) at least try to address this when they call for “age appropriate parental control tools” [328].
We also note that MEPs Walsmann and Štefanec (EPP) suggest obligations on search engines to delist content (remove it from search results) [243, 263, 265]. Whilst this is not a bad amendment per se, and when used properly to protect victims it could be seen as a natural extension of the right to be forgotten in search engines established through case law of the Court of Justice of the EU, we stress that removal remains the most effective method of stopping the spread of CSAM. It would be wise to ensure that delisting can only happen once all other avenues have been exhausted. For survivors of CSA, the most important thing is that the content is gone, not that it is merely blocked for EU users. Clearly, the same rationale applies to blocking orders more broadly, on which the IMCO MEPs have not tabled amendments (as blocking orders are outside IMCO competences).
We strongly support MEP Kolaja (Green)’s amendment for reporting only when there is actual knowledge of CSA, which is in line with the DSA, and to have an obligation for takedown of content [533]. MEPs Hahn and Körner (Renew) also push to align reporting to the DSA’s measures, which again would better ensure regulatory alignment and avoid duplication [553, 556]. These positive amendments, however, risk being undermined by MEP Rinzema et al (Renew) who want to encourage providers to seek knowledge of illegal material [563], potentially in contradiction of the DSA.
- We welcome the suggestion from most of the MEPs to limit the scope of interpersonal communications providers to only “number-independent” ones, effectively removing SMS/MMS and phone calls from the scope;
- We welcome the proposal from MEP Konečná (The Left) to bring in a stronger role for the European Data Protection Board (EDPB) [318];
- We criticise the proposal from MEP Bielan (ECR) to explicitly bring audio content into scope, which would amount to mass wiretapping [241];
- We support the logic from MEP Bielan (ECR) to exclude cloud services from detection orders, given that cloud providers usually do not (and rightly should not) have access to their customers’ cloud data [276, 321]. Just like application stores, providers of cloud infrastructure are generally ill-suited to assess and mitigate potential risks of use of their services for child sexual abuse. This amendment is also consistent with recital 27 of the DSA which says that requests for involvement “should, as a general rule, be directed to the specific provider that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects on the availability and accessibility of information that is not illegal content”;
- We criticise the proposal from MEPs Walsmann and Štefanec to add live streaming and live transmission to the scope [440], which would entail the use of even more invasive and even less effective tools;
- In dozens of amendments, MEPs Walsmann and Štefanec (EPP) and Lacapelle (ID) remove the possibility for independent administrative authorities to issue Detection Orders, instead leaving this role solely to judicial authorities. While we understand what is being attempted here, we warn that the concept of judicial authorities includes prosecutors, who are not independent authorities in most EU Member States. The best wording would therefore be “independent judicial authority” or “courts”;
- We welcome the attempt by MEPs Rinzema et al (Renew) and MEPs Walsmann and Štefanec (EPP) to ensure that the rules are not overly burdensome for SMEs [373, 389]. However, in the context of the CSA Regulation, such measures are a small concession to a proposal that overall puts burdens on providers to do a job that should be reserved for law enforcement agencies – for example, searching for evidence of grooming;
- We are concerned that MEPs Lacapelle (ID) and Bielan (ECR) want to remove the requirements for the EU Center to be independent and to have its scope limited to what is set out in the Regulation [591, 592, 594, 595]. We support MEP Kolaja (Green) putting in extra safeguards for the Center [606, 609, 610];
- We support the move by many MEPs to give a stronger role to national hotlines [628, 631, 629, 630 et al].
Read the full set of amendments put forward by MEPs in the IMCO Committee
Read EDRi’s position paper on the CSA Regulation
By Ella Jakubowska, Senior Policy Officer, EDRi & Jesper Lund, Chairman of EDRi member IT-Pol