Companies are now allowed to scan your private communications

“Any restrictions on children’s right to freedom of expression in the digital environment, such as filters, including safety measures, should be lawful, necessary and proportionate” and any digital surveillance of children “should respect the child’s right to privacy and should not be conducted routinely, indiscriminately” nor “should it take place without the right to object to such surveillance”.

By EDRi · May 5, 2021

“I didn’t have confidential communications tools when I was raped; all my communications were monitored by my abusers – there was nothing I could do, there was no confidence. […] I can’t help but wonder how much different my life would have been had I had access to these modern technologies. [The planned vote on the e-Privacy Derogation] will drive abuse underground making it far more difficult to detect; it will inhibit support groups from being able to help abuse victims – IT WILL DESTROY LIVES.”
Alexander Hanff (Victim of Child Abuse and Privacy Activist): “EU Parliament are about to pass a derogation which will result in the total surveillance of over 500M Europeans” (4 December 2020)

On 29 April 2021, the European Union co-legislators (EU Member States and the European Parliament (EP)) reached a provisional agreement, subject to formal approval, on temporary legislation allowing providers of electronic communications services such as web-based email and messaging services to continue to detect, remove and report child sexual abuse material (CSAM) online. The temporary legislation removes protections for confidential conversations between lawyers and their clients and between doctors and patients. It would also cover anti-grooming practices: unlike scanning for known illegal images, detecting grooming requires scanning entire conversations.

Wait, what? How did this happen?

In July 2020, the European Commission published a Communication on an EU strategy for a more effective fight against child sexual abuse material (CSAM). In September, the Commission followed up with a proposal for an interim Regulation that contained a number of problematic measures. It was harshly criticised by the European Data Protection Supervisor (EDPS) and the European Parliamentary Research Service (EPRS), and it contradicted some proposals put forward by UNICEF.

The Regulation was proposed because the European Electronic Communications Code (EECC) was due to take effect in December 2020, changing the legal definition of an “electronic communications service” to include Over The Top (OTT) services such as WhatsApp, Instagram messaging and Facebook Messenger. This automatically extends the scope of the ePrivacy Directive to cover OTT services, so all of its provisions protecting the privacy and confidentiality of communications apply to them as well.

Is this a done deal?

After the agreement between the EP and the Council, the interim legislation is on its way to entering into force. However, it is only interim legislation; long-term legislation is being prepared by the European Commission. EDRi shared its concerns in a response to the public consultation, together with some additional comments. There is still time to undo the harms the interim legislation has done, but this will not happen unless a significant and varied group of voices joins child rights groups in their quest to protect children from both the spread of child abuse material and surveillance. Children need both privacy and security.

United Nations: Bringing clarity to the debate

The UN Committee on the Rights of the Child has recently adopted General Comment No. 25 (2021) on children’s rights in relation to the digital environment. It calls on States “to ensure that their views are considered seriously and that children’s participation does not result in undue monitoring or data collection that violates their right to privacy, freedom of thought and opinion” (para. 18), states that “any restrictions on children’s right to freedom of expression in the digital environment, such as filters, including safety measures, should be lawful, necessary and proportionate” (para. 59), and affirms that any digital surveillance of children “should respect the child’s right to privacy and should not be conducted routinely, indiscriminately” nor “should it take place without the right to object to such surveillance”.

What’s next? Can I do something?

The long-term legislation is expected before this summer. Even though a longer, more muted debate is to be expected, the approval of the interim legislation could lead to all the harmful proposals being treated as already agreed upon by the European Parliament and Member States. Unless children’s rights groups and other human rights organisations speak with one voice to protect both the privacy and the security of children, we risk sliding further down the slippery slope the EU has already taken on issues with similar consequences, such as terrorist content, copyright infringement and now CSAM. Unless we halt the regulation now, new “exceptions” to the rule against mandatory monitoring of communications will continue to appear, and EDRi warns that at some point the exception could become the rule. If, as a recent poll indicates, 72% of citizens oppose EU plans to search all private messages for allegedly illegal material and report them to the police, there is much to change in current EU policy making.

Diego Naranjo

Head of Policy

Twitter: @DNBSevilla