Voluntary detection measures still on the table for the CSA Regulation

Whilst the draft EU CSA Regulation is intended to replace the current voluntary scanning of people's communications with mandatory detection orders, lawmakers in the Council and Parliament are actively considering supplementing this with "voluntary detection orders". However, our analysis finds that such voluntary measures would require a legal basis in the CSA Regulation, which would likely fall foul of the Court of Justice.

By EDRi · July 20, 2023

Content warning: contains discussions of child sexual abuse and child sexual abuse material

Bridging the gap between the interim Regulation and the CSA Regulation

The interim derogation from the ePrivacy Directive is set to expire on 3 August 2024. This controversial EU law currently allows providers of chat, message and email services to voluntarily scan the private communications of people in the EU. While the current plan is to have the new Child Sexual Abuse Regulation (CSAR) adopted before the European elections in June 2024 in order to replace the interim derogation, it is unlikely that the CSAR, and especially the provisions on detection orders, can be applied from August 2024.

For example, the EU Centre, which has a substantial role in the implementation of detection orders, will not be in operation by August 2024. Moreover, adopting the CSAR before the European elections is an extremely optimistic timeline, which could slip further if Council and Parliament find it difficult to reach an agreement on the highly contested detection orders. After all, every official legal analysis, except the one from the Commission in defence of its own proposal, has concluded that general and indiscriminate detection orders violate fundamental rights. Although there is considerable effort to ignore this legal obstacle, especially among Member States’ governments, these important questions about (the lack of) legality could resurface during the trilogue negotiations between the Council and Parliament.

The most pragmatic solution might be to simply extend the expiry of the interim derogation, either as a transitional provision in the CSA Regulation (if adopted before June 2024) or as a separate legislative proposal. The latter would need to go through the full legislative procedure with a Commission proposal followed by amendments and adoption in Parliament and Council.

However, there is also an interest, especially among centrist and right-wing lawmakers in the Civil Liberties (LIBE) Committee of the European Parliament, in a permanent provision to allow “voluntary” detection measures.

LIBE amendments on voluntary detection measures

The draft report of rapporteur MEP Javier Zarzalejos (EPP) introduces “voluntary detection orders” in a new Article 5a. The rapporteur’s intention seems to be to make the (real) detection orders proposed by the European Commission a measure of last resort, by allowing service providers to use the same general and indiscriminate detection technologies on a proactive basis. In his proposal, voluntary detection would be considered a risk mitigation measure under Article 4 which, unlike the other risk mitigation measures, would need approval from the competent judicial authority or another independent administrative authority.

In practice, however, such measures are unlikely to remain voluntary: in order to be seen as cooperative, and to avoid being served with a (real) detection order, providers will logically take the heaviest (and therefore most intrusive) measures that they are permitted to take under Article 4.

The name “voluntary detection orders” is something of a misnomer, as the measure proposed by the rapporteur is more obligatory than voluntary. In their risk assessment (Article 3), providers will be required to assess their use of voluntary detection technologies. As noted in EDRi’s initial assessment of the draft LIBE report, this creates a huge incentive to use voluntary detection in order to avoid a real detection order. Voluntary detection gives private companies greater influence over how detection technologies are deployed on their systems, and there are no mandatory safeguards that must be respected in order to protect the fundamental rights of the people using the service. The former reason (greater influence over the technology deployed) is likely why industry groups have lobbied for voluntary detection measures in the CSA Regulation.

Other LIBE amendments also deal with supposedly voluntary detection measures, either as a way to avoid real detection orders or to supplement them. AM 695 from a group of mostly EPP MEPs requires voluntary detection measures to be considered in the provider’s risk assessment, but expands on the rapporteur’s AM 73 by clarifying that such voluntary measures shall not undermine end-to-end encryption (E2EE). While this is a well-intentioned addition, it is highly unlikely that a provider offering E2EE services would want to use voluntary detection, because E2EE makes this technically impossible, at least insofar as scanning of message content is concerned.

In AMs 760, 799 and 825, Renew MEP Lucia Ďuriš Nicholsonová proposes the use of voluntary detection technology as a risk mitigation measure, similar to the approach taken by the rapporteur. There is no requirement for independent judicial authorisation, but AM 825 has a list of safeguards which closely resemble those in Article 3(1) of the interim derogation.

Lastly, in AMs 775/776 (identical), a number of mostly Renew and S&D MEPs, including the Renew shadow rapporteur Hilde Vautmans and the FEMM Opinion rapporteur Heléne Fritzon (S&D), propose to allow companies to continue the voluntary use of detection technologies as mitigation measures (under Article 4) with only prior authorisation from the Coordinating Authority. Since the Coordinating Authority (CA) is also tasked with assessing the provider’s risk assessment and choice of risk mitigation measures, the CA cannot be regarded as an independent administrative authority for the purpose of authorising voluntary detection measures. Moreover, AMs 775/776 contain no specific safeguards or criteria for assessing whether a voluntary detection measure should be approved.

Voluntary detection and fundamental rights

The use of detection technologies on a general and indiscriminate basis (arbitrarily affecting people who are not suspected of online child sexual abuse) is a particularly serious interference with fundamental rights, irrespective of whether the measure is a mandatory (real) detection order issued at the request of the Coordinating Authority or implemented by the service provider on a “voluntary” basis. In terms of the interference with fundamental rights, the fact that the mass surveillance is voluntary for the private company cannot be seen as a mitigating factor.

Moreover, since the scanning of private messages and social media posts constitutes a serious interference with fundamental rights, it must be limited to what is strictly necessary for achieving objectives of general interest recognised by the Union. If a particular measure is necessary for such an objective, it logically cannot be voluntary at the same time. This is a general critique of voluntary (law enforcement oriented) measures by private companies, which EDRi has voiced consistently in a number of contexts.

At the Council Law Enforcement Working Party meeting on 29 March 2023, the European Commission rejected the idea of a parallel legal basis for voluntary detection, as this would undermine the mandatory detection regime. The Commission also noted that measures which are sensitive to fundamental rights cannot be left to the discretion of private companies.

Mandatory detection orders, as in the CSA Regulation proposal, create legal obligations for private service providers which at the same time constitute a legal basis for their processing of personal data and communications data. Since fundamental rights recognised by the Charter of Fundamental Rights of the EU are restricted, the legislation must comply with the conditions of Article 52(1) of the Charter, which requires, among other things, that any limitation be provided for by law, be necessary, and be proportionate. If this is not the case, as with the 2006 Data Retention Directive, the legislation can be struck down by the Court of Justice of the European Union (CJEU).

Drawing on the CJEU case law regarding general and indiscriminate surveillance measures, the Council Legal Service, the complementary impact assessment requested by the LIBE Committee, the European Data Protection Supervisor and the European Data Protection Board have all concluded that the (mandatory) general and indiscriminate scanning of private messages in the CSA Regulation does not comply with the principle of proportionality required by Article 52(1) of the Charter, and is even likely to compromise the essence of the fundamental right to privacy.

The question is whether the legal assessment of general and indiscriminate voluntary detection measures would be any different. Here, it may be useful to distinguish between two different situations.

First, if the CSA Regulation provides a legal basis permitting the use of detection technologies by private companies, that legislation would clearly constitute an interference with fundamental rights which must comply with Article 52(1) of the Charter. If the legislation providing for voluntary detection measures permits general and indiscriminate scanning of private communications, the interference will not be limited to what is strictly necessary under the CJEU case law. A legal opinion from a former CJEU judge, commissioned by MEP Patrick Breyer in March 2021, concludes that legislation permitting general and indiscriminate scanning of private messages would be subject to the same legal standards before the CJEU as legislation requiring it.

Put private companies in charge of solving the legal problems?

Alternatively, the legislation permitting voluntary detection could refrain from providing a legal basis, and instead require that the implemented detection measures comply with the GDPR. This is the approach taken in the interim derogation, where recital 10 explicitly states that it does not provide a legal ground for the processing of personal data. What the interim Regulation does is change the legal framework applicable to number-independent interpersonal communications services from the ePrivacy Directive (ePD) to the GDPR. Whilst the ePD clearly bans voluntary scanning of communications data for the purpose of combating child sexual abuse, the GDPR offers more flexibility for processing personal data. However, the onus is then on the private service providers to identify a legal basis for their scanning of all messages.

A recent judgment (case C-252/21) from the CJEU casts considerable doubt on whether the processing of personal data for voluntary detection measures is permitted under the GDPR. In the judgment, the CJEU addresses whether Facebook (Meta) can collect and process personal data to combat harmful behaviour and promote safety, integrity and security. The CJEU holds that a commercial operator, such as a social media network, cannot rely on point (d) of GDPR Article 6(1), processing necessary to protect the vital interests of other natural persons, as a legal basis for this type of processing (paragraph 137 of the judgment). Moreover, legitimate interest (point (f) of GDPR Article 6(1)) cannot be used as a legal basis for processing related to voluntary information sharing with law enforcement agencies (paragraph 124).

In February 2021, the targeted substitute impact assessment on the interim derogation requested by the LIBE Committee identified points (d) and (f) of GDPR Article 6(1) as the only possible legal bases, but the new CJEU judgment seems to rule both of them out for voluntary detection. This kicks the ball back into the EU legislature’s court: if the CSA Regulation is to include voluntary detection measures, the legal basis for the processing of personal data and communications data must be explicitly provided for by the CSA Regulation itself.

This brings us back to the crux of the problem: if the CSA Regulation permits companies to proactively scan innocent people’s private communications, with almost no safeguards, it is hard to see how this could be considered necessary, proportionate and respectful of fundamental rights. And if such scanning is deemed to be necessary in a democratic society, how could it be left to the discretion of private, usually profit-driven, entities?