EU Parliament committee rejects mass scanning of private and encrypted communications

By EDRi · November 14, 2023

On 14th November, Members of the European Parliament’s ‘Civil Liberties’ committee voted against attempts from EU Home Affairs officials to roll out mass scanning of private and encrypted messages across Europe. It was a clear-cut vote, with a significant majority of MEPs supporting the proposed position.

A political deal struck by the Parliament’s seven political groups at the end of October meant that this outcome was expected. Nevertheless, this is an important and welcome milestone, as Parliamentarians demand that EU laws be based on objective evidence and scientific reality, and that they respect human rights law.

This vote signals major improvements compared to the Commission’s original draft law (dubbed ‘Chat Control’), which has courted controversy. The process around the legislation has faced allegations of conflicts of interest and illegal advert micro-targeting, and rulings of “maladministration”. The proposal has also been widely criticised for failing to meet EU requirements of proportionality – with lawyers for the EU member states making the unprecedented critique that the proposal likely violates the essence of the right to privacy.

In particular, today’s vote shows the strong political will of the Parliament to remove the most dangerous parts of this law – mass scanning, undermining digital security and mandating widespread age verification. Parliamentarians have recognised that no matter how important the aim of a law, it must be pursued using only lawful and legitimate measures.

At the same time, there are parts of their position which still concern us, and which would need to be addressed if any final law were to be acceptable from a digital rights point of view. Coupled with mass surveillance plans from the Council of member states and attempts from the Commission to manipulate the process, we remain sceptical about the chances of a good final outcome.

Civil liberties MEPs also voted for this position to become the official position of the European Parliament. On 20th November, the rest of the house will be notified about the intention to permit negotiators to move forward without an additional vote. Only after that point will the position voted on today be confirmed as the European Parliament’s mandate for the CSA Regulation.

Mass scanning (detection orders)

The European Parliament’s position firmly rejects the premise that in order to search for child sexual abuse material (CSAM), all people’s messages may be scanned (Articles 7-11). Instead, MEPs require specific suspicion – a principle similar to that underpinning warrants. This is a vital change which would resolve one of the most notorious parts of the law. The position also introduces judicial oversight of hash lists (Article 44.3), which we welcome. However, it unfortunately does not distinguish between basic hashing (which is generally seen as more robust) and perceptual hashing (which is less reliable).
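
For readers unfamiliar with that distinction, the short Python sketch below is our own illustration, not anything drawn from the Regulation: a cryptographic hash changes completely when a single pixel of an image changes, so a match means an exact copy, while a toy ‘average hash’ – a deliberately simplified stand-in for real perceptual hashes – stays the same, which is precisely what makes perceptual matching both flexible and error-prone.

```python
# Illustrative only: contrasts cryptographic hashing with a toy
# perceptual (average) hash on a simulated 8x8 greyscale image.
import hashlib

def crypto_hash(pixels):
    # Cryptographic hash: changing a single byte of the input yields a
    # completely different digest, so a match means an exact copy.
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel (above/below the mean).
    # Visually similar images produce identical or near-identical
    # hashes -- and unrelated images can also collide.
    mean = sum(pixels) / len(pixels)
    return ''.join('1' if p > mean else '0' for p in pixels)

original = [10, 200, 30, 220, 15, 210, 25, 205] * 8  # 64 "pixels"
tweaked = list(original)
tweaked[0] += 1  # a one-pixel edit

print(crypto_hash(original) == crypto_hash(tweaked))    # False: exact match broken
print(average_hash(original) == average_hash(tweaked))  # True: still "matches"
```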

At the same time, the wording also needs improvement to ensure legal certainty. The Parliament position rightly confirms that scanning must be “targeted and specified and limited to individual users, [or] a specific group of users” (Article 7.1). This means that there must be “reasonable grounds of suspicion a link […] with child sexual abuse material” (Articles 7.1. and 7.2.(a)). However, despite attempts in Recital (21) to interpret the “specific group of users” narrowly, we are concerned that the phrasing “as subscribers to a specific channel of communications” (Article 7.1.) is too broad and too open to interpretation. The concept of “an indirect link” is also ambiguous in the context of private messages, and should be deleted or clarified.

The Parliament’s position deletes solicitation (grooming) detection from the scope of detection orders, recognising the unreliability of such tools. However, the fact that solicitation remains in the scope of risk assessment (Articles 3 and 4) still poses a risk of incentivising overly-restrictive measures.

End-to-end encryption

The European Parliament’s position states that end-to-end encrypted private message services – like WhatsApp, Signal or ProtonMail – are not subject to scanning technologies (Articles 7.1 and 10.3). This is a strong and clear protection to stop encrypted message services from being weakened in a way that could harm everyone who relies on them – a key demand of civil society and technologists.

Several other provisions throughout the text, such as a horizontal protection of encrypted services (Article 1.3a and Recital 9a), give further confirmation of the Parliament’s will to protect one of the only ways we all have to keep our digital information safe.

There is, however, a potential (likely unintended) loophole in the Parliament’s position on end-to-end encryption, which must be addressed in future negotiations: whilst encrypted ‘interpersonal communications services’ (private messages) are protected, there is no explicit protection for other kinds of encrypted services (‘hosting services’).

It would therefore be important to amend Article 1.3a. to ensure that hosting providers, such as providers of personal cloud backups, cannot be required to circumvent the security and confidentiality of their services with methods designed to access encrypted information, and to amend Article 7.1. so that it is not limited to interpersonal communications.

Age verification & other risk mitigation measures

The European Parliament’s position is mixed when it comes to age verification and other risk mitigation measures. EDRi has been clear that mandatory age verification at EU level would be very risky – and we are glad to see that these concerns have been acted upon. The European Parliament’s position protects people’s anonymity online by removing mandatory age verification for private message services and app stores, and adds a series of strong safeguards for its optional use (Article 4.3.a.(a)-(k)). This is a positive and important set of measures.

On the other hand, we are disappointed that the Parliament’s position makes age verification mandatory for porn platforms (Article 4a.) – a step that is not coherent with the overall intention of the law. What’s more, the cumulative nature of the risk mitigation measures for services directly targeting children in the Parliament’s position (Article 4.1.(aa)) needs further attention.

This is because no exception is made for cases where the measures might not be appropriate for a particular service. The requirements could instead push platforms or services to exclude young people altogether in order to avoid them.

We recommend that there should not be mandatory age verification for porn platforms, and that risk mitigation measures should oblige providers to achieve a specific outcome, rather than creating overly-detailed (and sometimes misguided) service design requirements. We also warn that the overall CSA Regulation framework should not incentivise the use of age verification tools.

Voluntary scanning

The European Parliament’s position does not include a permanent voluntary scanning regime, despite some MEPs calling for such an addition. This is an important legal point: if co-legislators agree that targeted scanning measures are a necessary and proportionate limitation on people’s fundamental human rights, then they cannot leave such measures to the discretion of private entities. The Parliament’s position does, however, extend the currently-in-force ‘interim derogation’ by nine months (Article 88.2).

Web crawling

[Please note that this section was updated on 15 November to remove incorrect information about the powers of Europol to treat unfounded material]

Whilst the overall direction of the Parliament’s position is positive, some new measures have been introduced or maintained which are troubling – in particular web crawling. Articles 43.4a. and 49.1. allow the EU Centre to undertake proactive web crawling. Whilst it is positive that this is limited to publicly-accessible content, it should be clarified that this should apply only to searches for known material (part (b) currently allows it also for new material), and that searching under point (ba) should be done on a case-by-case basis (not ‘proactive’ / mass crawling). Otherwise there is a risk of mass scanning of public-facing communications using tools which are known to be unreliable at scale.
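
To make that distinction concrete, the hypothetical Python sketch below is our own illustration and appears nowhere in the Regulation: checking crawled content against a vetted list of hashes of known material is a deterministic lookup, whereas flagging ‘new’ material requires a statistical classifier whose false positives multiply at crawl scale.

```python
# Hypothetical sketch: known-material matching vs. new-material detection.
# All names and values are illustrative; nothing here is from the Regulation.
import hashlib

# A vetted list of hashes of known material, seeded with a dummy entry
# so that the example is self-contained and runnable.
KNOWN_HASHES = {hashlib.sha256(b"previously verified item").hexdigest()}

def matches_known_material(content: bytes) -> bool:
    # Deterministic lookup: the exact item is on the list or it is not.
    return hashlib.sha256(content).hexdigest() in KNOWN_HASHES

def flag_new_material(content: bytes) -> float:
    # Detecting *new* material requires a statistical classifier that
    # returns a probability, not a fact. Even a small false-positive
    # rate, applied to millions of crawled pages, produces thousands
    # of wrong flags -- the unreliability-at-scale problem noted above.
    raise NotImplementedError("model-dependent and error-prone at scale")

print(matches_known_material(b"previously verified item"))  # True
print(matches_known_material(b"anything else"))             # False
```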

Interests of commercial entities

Not enough is done in the Parliament’s position to ensure that the EU Centre’s technology board will be genuinely independent and free of the interests of entities that develop or provide scanning technologies. Given the alleged over-closeness of ‘not-for-profit start-up’ Thorn to the European Commission in the preparation of this law, it is vital that such interests are not allowed to oversee or decide on the choice of technology, and that this law does not hand them a market monopoly over scanning technologies. The independence of the EU Centre’s technology board must be guaranteed, and measures introduced to fend off commercial interests or dominance.

In addition to taking the sting out of Chat Control, the Parliament’s position makes a number of other positive moves:

  • Bolstering the prohibition of general monitoring (Article 1.3b);
  • Removing phone calls and SMS messages (number-dependent interpersonal communications services) and audio communications from the scope of the proposal;
  • Increasing the remit of the EU Centre to focus on prevention, education and the centralisation of skills and resources, making it more effective and also more appropriate given that this is a single market law, not a policing law. At the same time, it does not seem to be foreseen how the EU Centre will be able to properly resource this increase in mandate;
  • Establishing a Victims’ Rights and Survivors Consultative Forum to ensure that survivor voices are integrated into the EU’s approach to tackling child sexual abuse. The Parliament also introduces a fundamental rights officer for the EU Centre.

In addition to our concerns about ambiguous wording and loopholes, overly-broad Europol access and insufficiently strict measures to stave off commercial interests, there are other points that remain worrying:

  • Although blocking orders (Articles 16-18) are no longer mandatory, they remain technically difficult to use, meaning that they are likely to be ineffective;
  • Recital 21a tries to prevent the definition of child sexual abuse from Directive 2011/93/EU from criminalising sexual self-expression between consenting teenagers, but it likely does not go far enough to properly protect adolescents on this point;
  • “Substantial exposure” to CSAM (Article 3.1(a)) has a very low threshold, so it cannot genuinely be considered “substantial”;
  • Article 22(1)(a) may amount to disproportionate data retention. Whilst this does not seem to be intentional, it should be fixed to ensure that this provision cannot be abused;
  • Searches to ensure compliance (Article 31) continue to allow authorities to scan for ‘new’ material using unreliable tools, which could generate artificial and unreliable results;
  • Although the Victims’ Rights and Survivors Consultative Forum is a good idea overall, there are problems with its remit. The Forum is given a role in the assessment of technologies (Articles 50.1 and 66a.4.(c)), despite the composition of the board not requiring any expertise in technology;
  • Whilst the gendered impacts of CSA are recognised throughout the Parliament’s position, other intersections – such as sexuality – are not given the same attention.