Position Paper: Age verification can’t ‘childproof’ the internet
EDRi has published its policy paper on age verification to shed light on the risks of the widespread use of age verification and to chart out possible alternative solutions.
Age verification is the process of predicting or confirming an individual’s age on the internet. Increasingly, governments and the age verification industry tell us that we need this tool to keep children safe online.
For example, in the EU, the draft Child Sexual Abuse (CSA) Regulation proposes to mandate forms of age verification for private message services (e.g. WhatsApp, Signal) and app stores operating in the EU, and to strongly incentivise it for all other digital platforms and services, such as social media.
It is important to ensure that children access online content that is appropriate for their age. However, quick technological fixes like online age verification are not up to the task. On the contrary, the adoption of age verification systems can have serious human rights implications, especially for the children they are intended to protect.
“Age verification is often put forward as the ‘obvious’ solution to a wide range of online harms. But this superficial approach fails to recognise the threats posed by these invasive systems. Many such tools rely on exactly the kind of toxic mass data gathering by governments and corporations that EDRi has fought against for decades.”
Tech can’t fix human issues
It is essential to understand that online harms are rooted in societal problems, rather than technical ones. Over-emphasising age verification may lead to ignoring the root problems that facilitate or exacerbate online harm.
A holistic approach is necessary: one focused on supportive measures that prioritise privacy and safety by design, rather than restricting access. The United Nations and UNICEF, for instance, recognise children’s rights to freedom of expression and access to information online. Children’s independence and self-development depend on their ability to explore their identity freely, including their sexuality and their democratic participation, through online search and communication.
As digital tools become increasingly prevalent in young people’s lives, limiting their access to legitimate online services and content must be approached with extreme caution.
Online anonymity enables people to enjoy their civil liberties
Online anonymity is critical for protecting the civil liberties of both children and adults. Normalising the requirement to present identity documents in order to participate in society, including for children, may have unintended negative consequences.
EDRi cautions that identity cards should be used only when strictly necessary and in full compliance with EU human rights law and the Convention on the Rights of the Child.
Risks of the use of age verification
Based on the methods analysed in this paper, we identify six key risks of the use of age verification tools:
- Violating children’s privacy and data protection rights;
- Infringing upon children’s autonomy and self-expression online;
- Letting companies control what children can see and do online;
- Making anonymity online difficult or impossible;
- Exacerbating structural discrimination; and
- Creating a false sense of security.
After charting out the three main types of age verification, EDRi’s analysis concludes that there are no EU-wide ‘document-based verification’ or ‘estimation’ tools that minimise these risks to the extent that their widespread use could be considered compatible with children’s rights. Age ‘declaration’ tools are more likely to align with children’s rights, but they require additional research and development to improve their effectiveness. As a general rule, therefore, policy- and law-makers must not mandate age estimation or document-based verification measures.
EDRi warns that any law mandating the use of age verification systems to control access to digital platforms and services, such as the measures proposed in the CSA Regulation, poses an unjustifiable threat to children’s digital rights and must be rejected. The proposed risk assessment and mitigation process should not incentivise the use of document-based verification and estimation tools.
Read more:
- Position Paper: State access to encrypted data: EDRi’s policy paper on encryption highlights that our privacy and security must be strongly protected, taking into account recent policy developments on encryption and law enforcement.
- Position paper: A safe internet for all – Upholding private and secure communications: Despite the importance of its goals, the European Union’s proposed Child Sexual Abuse Regulation (CSAR) will not only fail in its aims to protect young people, but it...
- CSA Regulation Document Pool: This document pool contains updates and resources on the EU’s proposed ‘Regulation laying down rules to prevent and combat child sexual abuse’ (CSA Regulation).