Position paper: A safe internet for all – Upholding private and secure communications

Despite the importance of its goals, the European Union’s proposed Child Sexual Abuse Regulation (CSAR) will not only fail in its aim to protect young people, but will even harm those it seeks to protect.

By EDRi · October 19, 2022

EDRi has published its position paper, which lays out key concerns and encourages policymakers to pursue the whole-of-society solutions put forward by child rights groups to tackle the horrendous crime of child sexual abuse, rather than a harmful surveillance approach.

The EDRi network argues that the proposed CSAR lacks a sufficient legal basis; contradicts substantial portions of EU law, in particular fundamental rights law; adds significant complexity to existing processes, which could hamper current national efforts to remove CSAM; and is technically impossible for service providers to implement in a way that respects rights and achieves its stated aims.

The proposed law demonstrates a naïve faith in technology as a silver bullet, at the detriment of paying attention to the advice of hundreds of cybersecurity and digital human rights experts around the world.

Read the full analysis

It is imperative for the EU co-legislators, the European Parliament and the Council, to reject the proposal because it cannot effectively protect children. The proposal is a fishing expedition that treats all internet users in the EU as potential child sexual abuse perpetrators. By casting a wide net instead of starting with reasonable suspicion of individual perpetrators, the proposal turns the presumption of innocence on its head and undermines the rule of law and due process.

For example, the story the New York Times broke in September tells the experience of a father whose child was sick and who had to send photos to his general practitioner. Google’s artificial intelligence wrongly flagged the images as child sexual abuse material, and the father was reported to the police even after human review by Google. This real-life story shows that the mass scanning of all messages online is not effective, and that the CSAR’s intention to tackle child sexual abuse through quick tech measures is doomed to fail. If the law is not rejected, its failure will amplify insecurities for young people, because subjecting children to generalised digital surveillance and denying them safe, private online spaces is itself harmful.

Child sexual abuse survivor and privacy advocate Alexander Hanff explains that intrusive internet monitoring regulations deprive survivors of safe spaces and can also disincentivise them from seeking help.

The key issues raised by the CSAR proposal

In its position paper, EDRi raises many serious concerns about the technical and practical infeasibility of the highly complicated proposal, along with procedural concerns that the proposed solutions could make it harder for law enforcement agencies to investigate and prosecute perpetrators of CSA. EDRi also warns that the proposal fails to sufficiently engage with preventive and societal measures which could stop this problem from existing in the first place. In particular, the proposal:

  • Poses an unjustifiable limitation on the human right to privacy.

  • Interferes with a wide range of other rights and freedoms, including free expression and children’s rights.

  • Is highly likely to amount to general monitoring of a wide range of digital communications.

  • Will inevitably lead, as EDRi has repeatedly argued, to dangerous and unreliable client-side scanning practices that undermine the essence of end-to-end encryption.

  • Ignores warnings from cybersecurity experts and technologists around the world that encryption cannot be safely or effectively circumvented in this way, and that such interference would leave everyone’s devices vulnerable to attacks from malicious actors.

  • May alternatively incentivise providers to abandon encryption entirely.

  • Relies on known-CSAM detection technologies with significant error rates, making the European Commission’s claims of accuracy misleading.

  • Depends on AI-based technology that is fundamentally ill-suited to identifying the context-dependent crime of CSAM.

  • Envisages grooming detection which is unreliable, threatens the presumption of innocence, and cannot provide reliable evidence for criminal investigations.

  • Uses risk assessments to smuggle in all sorts of dangerous surveillance measures, as well as censorship and age verification.

  • Designs Detection Orders in a way which can never be targeted or proportionate, meaning that no safeguards are sufficient to mitigate the harm they pose.

  • Can force providers to stop young people from accessing legitimate, privacy-respecting digital services, such as end-to-end encrypted messengers. This poses an especially severe risk to young people whose abuse is committed by someone in their family environment, as it removes one of their means of seeking help.

  • Encourages the use of digital identity as a precursor to accessing all digital communications, including for under-18s, which could further exclude vulnerable young people (such as undocumented children, Roma children, and those facing other forms of structural exclusion and isolation) who already face barriers to accessing digital ID systems.

  • Will make it harder for young people to access secure, private communications as a result of the weakening of encryption.

There is no silver bullet for tackling child sexual abuse

Instead of implementing surveillance measures that are ultimately likely to do more harm than good, the EDRi network suggests that lawmakers focus on alternative, structural measures that tackle the root causes of the horrific crime of child sexual abuse, for example:

  • Education, awareness-raising and empowerment of survivors
  • Social and structural change
  • Reform of police and other institutions
  • Investment in child protection hotlines
  • Enforcement of existing rules
  • Collaboration among child rights groups, educators, social workers, digital rights groups, and other human rights groups on solutions

What’s EDRi doing?

Along with 117 other civil society groups – including those working on children’s digital rights, children’s health, support for victims of online abuse, and the empowerment of girls and women – EDRi has been calling on the EU to withdraw the CSAR and pursue alternative measures that are more likely to be effective and sustainable, and that fully respect EU fundamental rights. Now, we have launched a Europe-wide collective action that will pressure the EU’s co-legislators, the European Parliament and the Council, to reject the CSAR proposal and stop EU attempts to scan every move we make.