Despite warnings from lawyers, EU governments push for mass surveillance of our digital private lives

Whilst several EU governments are increasingly alert to why encryption is so important, the Council is split between those that are committed to upholding privacy and digital security in Europe, and those that aren’t. The latest draft Council text goes nowhere near far enough to make scanning obligations targeted, despite clear warnings from their own lawyers.

By EDRi · July 4, 2023

Earlier this year, a damning opinion about the EU’s proposed CSA Regulation, authored by the lawyers who are responsible for advising EU Member State governments, was made public. Dated 26 April 2023, this opinion sent shock-waves through Brussels and beyond, thanks to its unprecedented level of criticism of an EU legislative proposal.

The opinion warned of “a serious risk that it [the CSA Regulation] would be found to compromise the essence of the rights to privacy and data protection”, would undermine encryption, and would allow “general and indiscriminate access to the content of personal communications” by companies.

These concerns are very similar to those raised by other EU bodies, in particular the European Data Protection Supervisor (EDPS), the European Parliament, and even the European Commission’s own regulatory scrutiny board. This shows a strong formal legal consensus that the proposed Child Sexual Abuse Regulation, put forward by Home Affairs Commissioner Ylva Johansson, falls seriously short of the proportionality requirement it would need to meet in order to be considered lawful in the EU.

In the Council of the EU – the forum in which representatives from each of the EU’s twenty-seven Member States come together to discuss and negotiate on EU laws and policies – a clear split has emerged between those countries that want to protect human rights and the security of digital communications on the one hand, and those who are gambling it away on the false promise of a quick technological solution on the other.

EU governments split on the way forward

Following a whopping sixteen meetings of the Council’s ‘LEWP’, or ‘Law Enforcement Working Party’ (the group leading negotiations for their respective countries on the CSA Regulation), by the beginning of June 2023, there was still a lack of agreement on the most controversial parts of the law. This is likely to be a positive thing: whilst the governments of Spain and Cyprus have made it clear that they see this law as their opportunity to get broad access to people’s private encrypted messages, they have – luckily – not been able to find consensus for this position with other EU governments.

In fact, many national parliamentarians have urged their countries not to agree to the law unless the serious issues are resolved. This includes a resolution from the French senate, a binding motion from the Austrian parliament, a motion from the Dutch parliament, concerns from the Czech and German parliamentary committees, and a letter from the Irish justice committee. These concerns join, among many other initiatives, a call from 134 NGOs to reject the law, and a plea to the Belgian government to ensure the full respect for human rights of Belgian citizens and residents.

The LEWP therefore escalated the discussions to their countries’ ambassadors for a meeting held on 1 June. This meeting was to try to decide what to do about encryption (whether it should be specifically protected or not), audio communications (whether they should be subject to scanning), types of material (whether scanning should be done only for known material using ‘hashing’, or if it can also be done using AI-based indicators to predict abuse in images, videos, behavioural patterns, text and audio), and whether or not to target the proposed measures against suspected persons or allow tech companies to scan potentially every person’s private communications.
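The technical gap between these options is worth spelling out. Detecting ‘known’ material can be as narrow as matching cryptographic fingerprints of files against a database of previously identified material, whereas AI-based detection of ‘new’ material or grooming is predictive and error-prone. A minimal illustrative sketch of exact hash matching (the database entry below is a placeholder digest, not real data):

```python
import hashlib

# Illustrative sketch only: hash-based detection of 'known material' compares
# a fingerprint of a file against a database of fingerprints of previously
# identified material. The entry below is simply the SHA-256 digest of the
# bytes b"test", used as a placeholder.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_material(data: bytes) -> bool:
    """Return True only if the exact bytes match a stored fingerprint."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(is_known_material(b"test"))   # True: exact match with the stored digest
print(is_known_material(b"Test"))   # False: any change to the bytes defeats an exact hash
```

In practice, deployed systems tend to use perceptual hashes (such as Microsoft’s PhotoDNA) that tolerate small edits; detecting genuinely ‘new’ material or grooming, by contrast, requires AI classifiers that predict rather than match, which is precisely where the reliability concerns raised in the Council discussions arise.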

Types of material and audio communications

In the 1 June meeting, sixteen EU member state ambassadors indicated their support for keeping all three types of content (known material, new material and grooming) in scope of the proposal’s obligations. However, that means that 40% of the ambassadors were either ambivalent or opposed – with the Netherlands being especially vocal that searching for new CSAM or grooming is clearly the role of police, not of private companies.

EDRi has warned that all three types of content scanning can create serious risks, especially those relying on artificial intelligence. Both the European Parliament’s internal markets committee and independent child rights specialist Dr Sabine K Witting have warned against including tools to detect grooming, given their unreliability and unsuitability for this sensitive context.

The most recent proposed Council text in response to these views, however, leaves all three types of content in scope. For grooming detection, the Council Presidency merely proposes to exclude live audio communications (such as voice calls) but to keep other audio communications in scope. This is only a small concession: recorded audio (such as a voice note) would remain in scope, allowing measures that amount to wiretapping. It appears that only Germany and the Netherlands are actively pushing to remove this from the CSA Regulation.

The protection of encryption and private communications

Fortunately, despite the desire of several member states to try to stop people from benefiting from the security and confidentiality offered by end-to-end encrypted communications in Europe, we can see that the protection of encryption is a non-negotiable for the governments of Austria and Germany. Austria has proposed to scrap the scanning of private messages in this law entirely – a move that EDRi wholeheartedly supports – and Germany has proposed to expressly exclude encrypted messages.

The governments of Sweden, Estonia, Portugal, Malta, Czechia, Poland, Finland and Luxembourg also seemed to raise concerns, with several of these countries having official positions to protect encryption. This was sufficient for the current Presidency of the Council of the EU, Sweden, to propose the following draft compromise, with an additional recital to confirm that client-side scanning also cannot be allowed:

“This Regulation shall not prohibit, make impossible, weaken, circumvent or otherwise undermine cybersecurity measures, in particular encryption, including end-to-end encryption, implemented by the relevant information society services or by the users. This Regulation shall not create any obligation to decrypt data.”

If this wording is adopted, it would be very significant. On the other hand, Spain has just taken the role of Council Presidency – meaning that they will be taking over from Sweden in being responsible for putting forward wording for the other countries to review. With the above wording not yet confirmed, we fear that Spain might try to reverse this positive development.

What’s more, despite concerns about the risks of scanning private communications, the latest text from Sweden does not do anything to limit or target scanning obligations to only those reasonably suspected of child sexual abuse offences. Without such changes, it is clear from the opinion of their own legal service that this proposal will remain a serious breach of EU human rights law.

What’s next?

Council documents also show that the European Commission is open to extending the interim derogation (the precursor to the CSA Regulation), exposing their fears that the Council and Parliament are a long way from consensus. At the same time, the interim derogation has been widely criticised, and even the Commission has recognised that its voluntary measures are legally incoherent and cannot become a long-term solution.

Shockingly, the most recent draft Council text explicitly suggests to exclude state communications from the scope of the regulation! This shows that EU governments recognise just how dangerous these measures are for everyone’s private communications, and want to put themselves above the law. Whilst their own communications would be kept safe, the rest of us would be subjected to large-scale violations of our privacy and data protection rights – ‘just in case’ they find that we are doing something wrong.

The struggle of the Member States to find consensus shows how controversial this proposal is, as well as the impossibility of trying to legalise gross digital rights violations. The problems with the draft CSA Regulation are so severe, and would need so many changes to make the proposal lawful, that we still believe the best solution is to withdraw the proposal and go back to the drawing board.

If the EU pushes through a law that is bound to be annulled by the Court of Justice for so clearly violating fundamental rights, it will be the ultimate failure for the children this law is supposed to protect.