News from Ireland questions effectiveness and lawfulness of online scanning for tackling child sexual abuse: Lessons for the EU

An investigation in Ireland published today shows that tools for scanning private communications to detect child sexual abuse material (CSAM) online not only suffer from low accuracy and high rates of false alarms, but have also put people’s data and privacy at risk without reasonable suspicion.

By EDRi · October 19, 2022

The European Union’s recently-proposed Child Sexual Abuse Regulation (CSAR) relies on similar ineffective tools that threaten to fundamentally undermine secure communication online and in effect make the internet less safe for everyone.

If passed, CSAR will turn the internet into a space that is harmful to everyone’s privacy, security and free expression – with severe consequences for youth, human rights defenders, journalists and those relying on secrecy to protect their work and civil organising. It will also harm the very children that the legislation aims to protect.

Today, EDRi, together with 13 organisations, launches the “Stop Scanning Me” campaign. The coalition calls for the rejection of the European Commission’s proposal.

The new Irish case: False positive referrals hinder police efforts to tackle CSAM

EDRi affiliate the Irish Council for Civil Liberties (ICCL) has just published data revealing that, of all the referrals of suspected CSAM online that the US National Center for Missing and Exploited Children (NCMEC) sent to the Irish police in 2020, many of which result from scanning practices, just 20% actually constituted CSAM.

Fewer than 10% were actionable, and no information was provided about whether any of these actionable reports led to prosecutions or convictions. At least 1 in 10 reports were confirmed false alarms, including reports of innocent people who had shared images or videos of children playing on a beach, or legitimate content by adults, such as topless, nudist or ‘adult’ content.

“The creation and circulation of CSAM online or offline is a heinous crime. Effective measures must be taken to protect the rights and freedoms of victims and survivors,” said Olga Cronin of the Irish Council for Civil Liberties (ICCL). “However, the figures we present today question the current efforts to combat it. Innocent people have been unlawfully kept in a net of surveillance and suspicion with no cause.”

The case study also shows that the Irish police retained data in their CSA database (email, screen name, IP address) about the innocent people who were reported on the basis of false alarms. This may constitute a breach of EU data protection law, as well as a violation of due process rules and a suppression of free expression.

This case study challenges the European Commission’s claims that scanning technology is highly accurate and reliable, and that false reports are negligible and harmless.

The large number of false positive referrals could even hinder the police’s capacity to effectively tackle the spread of CSAM, as officers will be tied up with even more false alarms under the EU’s CSA Regulation. This has been confirmed, for example, by the Dutch police, who say that they will not be able to handle the expected number of reports under the new EU law.
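The arithmetic behind this is simple. The Python sketch below uses purely hypothetical volumes and error rates (none of these figures come from the ICCL report or the Commission) to show why even a scanner that sounds highly accurate produces overwhelmingly false alarms when actual CSAM is a tiny fraction of the traffic being scanned:

```python
# Back-of-the-envelope base-rate calculation.
# All numbers are hypothetical assumptions for illustration only.

messages_scanned = 1_000_000_000  # assumed daily volume for a large provider
base_rate = 1e-6                  # assumed share of messages that are actually CSAM
detection_rate = 0.90             # assumed true positive (recall) rate
false_alarm_rate = 0.001          # assumed 0.1% false positive rate

actual_csam = messages_scanned * base_rate
true_positives = actual_csam * detection_rate
false_positives = (messages_scanned - actual_csam) * false_alarm_rate
precision = true_positives / (true_positives + false_positives)

print(f"Messages flagged: {true_positives + false_positives:,.0f}")
print(f"  actual CSAM:    {true_positives:,.0f}")
print(f"  false alarms:   {false_positives:,.0f}")
print(f"Precision:        {precision:.2%}")  # under 0.1%: almost every flag is innocent
```

Under these assumptions, roughly a million messages would be flagged per day, of which fewer than a thousand would be actual CSAM; every one of the remaining flags is an innocent person’s private message put in front of an investigator.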

“As a society, we cannot afford to put energy into counterproductive measures given how serious this crime is. We need to do far better, for example by investing in the sex crime units of the police and focusing on prevention. When police forces tell us they don’t have the capacity to deal with the European Commission’s new law proposal, we need to wonder whether the EU is actually failing the children it aims to protect.” – Rejo Zenger, Bits of Freedom

EDRi’s new position paper: The EU must pursue alternatives to CSAR

Today, 19 October, EDRi published its position paper, which shows that the proposed CSA law poses a serious threat to everyone’s privacy, security and free expression online – including the children it aims to protect. It likely lacks a sufficient legal basis and contradicts substantial portions of EU law, in particular fundamental rights law; it could hamper current efforts to remove CSAM; and it is technically impossible to implement in a way that is both effective and rights-respecting.

Some of the recommendations in EDRi’s paper include focusing on education, awareness-raising and empowerment of survivors, social and structural change, reforming the police and other institutions, investing in child protection services, and bringing together child rights and digital rights groups, as well as other experts, to work together on solutions.

However, the proposed CSAR demonstrates a naïve faith in technology as a silver bullet, to the detriment of heeding the advice of hundreds of cybersecurity and digital human rights experts around the world.

The proposal also fails to sufficiently engage with preventive and societal measures which could stop this problem from existing in the first place.

In addition, the CSAR threatens to fundamentally undermine end-to-end encryption, a vital human rights tool relied upon by people all across the world, and will incentivise service providers to take the most intrusive measures possible in order to avoid legal consequences.

Such detection measures will inevitably lead to dangerous client-side scanning practices which undermine people’s right to private and secure communication. This will have an even graver impact on journalists, human rights defenders, and survivors of child sexual abuse and intimate partner violence, especially those at risk of stalking by a (former) partner, as well as on young people communicating legitimately, in particular LGBTQ+ youth.
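To make the mechanism concrete, here is a deliberately simplified Python sketch of where client-side scanning sits in the message flow. It is an illustrative assumption, not any vendor’s or the Commission’s actual design: it uses exact SHA-256 matching for brevity, whereas real proposals rely on perceptual hashing or machine-learning classifiers, which tolerate re-encoding and are precisely what makes false matches possible.

```python
import hashlib

# Simplified illustration of client-side scanning (CSS).
# Assumption: exact SHA-256 matching against a blocklist; real systems
# use perceptual hashes or classifiers, which are more error-prone.

BLOCKLIST = {  # hypothetical hash database distributed to every device
    hashlib.sha256(b"known illegal image bytes").hexdigest(),
}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the content matches the blocklist (i.e. would be reported)."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send_message(plaintext: bytes) -> None:
    # The scan runs on the *plaintext*, on the device, before any encryption.
    if client_side_scan(plaintext):
        print("match -> content reported before it is ever encrypted")
    # Only after the scan would the message be end-to-end encrypted and sent
    # (encryption and transport are stubbed out; they are beside the point).

send_message(b"an ordinary private message")  # no match, sent normally
send_message(b"known illegal image bytes")    # matched and reported
```

Because the inspection happens before encryption, a mandate like this hollows out the end-to-end guarantee even if the cryptography itself is left untouched.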

The proposed law treats all internet users in the EU as potential child sexual abuse perpetrators instead of starting from reasonable suspicion of individual perpetrators. This violates the presumption of innocence and rule-of-law principles.

Read the full analysis

EDRi urges the co-legislators to take the issue of CSAM seriously by ensuring that laws mandating the use of digital technology are realistic, achievable, lawful, rights-respecting and actually effective. 

Check out EDRi’s analysis and recommendations

We must act now!

“By ramping up pressure and moral panic in an attempt to hastily push through a law that’s incompatible with EU rights and values, Commissioner Johansson may be sealing her own proposal’s fate.” – Ella Jakubowska, Senior Policy Advisor, EDRi

Along with 117 other civil society groups, EDRi called on the EU in an open letter to withdraw the CSAR and pursue alternative measures that are more likely to be effective and sustainable, and that fully respect EU fundamental rights. Letter signatories included organisations working on children’s digital rights, children’s health, support for victims of online abuse, and the empowerment of girls and women.

Now, we have launched a Europe-wide collective action calling on the EU’s co-legislators, the European Parliament and the Council, to stop EU attempts to scan almost every move we make online by rejecting the CSAR proposal.

#StopScanningMe: Join the movement