
Digital rights for all

In this article we set out the background to EDRi's work on anti-discrimination in the digital age. Here we take the first step: exploring anti-discrimination as a digital rights issue, and asking what EDRi can do about it. The project is motivated by the need to recognise how oppression, discrimination and inequality affect the enjoyment of digital rights, and to live up to our commitment to uphold the digital rights of all.

By EDRi · July 8, 2020

The first half of 2020 has brought challenges and shifts on a global scale. From COVID-19 to #BlackLivesMatter, these events necessarily shape EDRi's work, because they raise issues of digital and human rights: our privacy, our safety, and our freedoms, online and off. Not only have they brought privacy and surveillance to the forefront of global politics, they have also taught us about vulnerability.

Vulnerability is not a new concept in digital rights. It is core to the fight to defend rights and freedoms online: we are vulnerable to targeted advertising, to the exploitation of our personal data, to censorship, and to increased surveillance. In times of crisis especially, this vulnerability is exposed and exacerbated at once, with increased surveillance justified in the name of the public good.

How exactly can we understand vulnerability in terms of digital rights? In many senses, this vulnerability is universal. Ever-encroaching threats to our privacy, state surveillance, and the mining of data on our personal lives for profit confront everyone in the digital age.

Yet, just as we have seen the myth of universal vulnerability to the Coronavirus debunked, we are also learning that we are not equally vulnerable to threats to privacy, censorship and surveillance. State and private actors abuse their power in ways that exacerbate injustice and threaten democracy and the rule of law. The way technologies are deployed often amplifies inequalities, especially when location and/or biometric data are used. Taking a leaf out of the book of anti-racism movements, we recognise that people are not simply 'vulnerable' to discrimination, exploitation and other harms: those harms are imposed on them. Rather than vulnerable, some groups are marginalised, through active processes in which people, institutions and structures of power are the cause.

Going forward, an awareness of how marginalised groups enjoy their digital rights is crucial to a better defence and protection for all. From the Black, brown and Roma communities who are likely to be impacted by data-driven profiling, predictive policing and biometric surveillance; to the mother who only sees online job advertisements that fit her low-income profile; the child whose online learning experience should not be tainted by harmful content; the undocumented person who avoids health services for fear of data-sharing and deportation; the queer and trans people who rely on anonymity to ensure a safe experience online; the Black woman whose account has been suspended for using anti-racist terminology; and the protester worried about protecting their identity: infringements of 'digital rights' manifest differently for each of them. Often, the harm cannot be corrected with a GDPR fine alone. It cannot be resolved with better terms and conditions. This is not just a matter of data protection, but of broader violations of human rights in a digital context.

These wider questions of harms and infringements in the digital age will challenge our existing frameworks. Is there a universal ‘subject’ for digital rights? Who are we referring to most often under the term ‘user’? Does this fully recognise the varying degrees of harm we are exposed to? Will the concept of rights holders as ‘users’ help or hinder this nuanced approach? Beyond ‘rights’, how do ideas of equality and justice inform our work?

EDRi members such as Privacy International have denounced data exploitation and shown how marginalised groups are disproportionately affected by digital rights violations. Panoptykon has explored how algorithmic profiling systems impact the unemployed in Poland, and integrates the risks of discrimination into its analysis of why the online advertising system is broken. At Privacy Camp, EDRi members have been reflecting on how children's rights and online hate speech affect our work as a digital rights network. Building on this work, EDRi is mapping the organisations, projects and initiatives in the European digital rights field that include a discrimination angle, or that explore how people in different life situations experience digital rights. Once we have a picture of the work ongoing in the field and the main gaps, we will explore how EDRi can move forward, potentially through further research, campaigns, or efforts to connect digital and non-digital organisations.

We hope that this project will help us to meet our commitment to uphold digital rights for all, and to challenge power imbalances. We are learning that a truly universal approach recognises marginalisation in order to contest it. To protect digital rights for all, we must understand these differences, highlight them, and then fight for collective solutions.

Read more:

Who They Target – Privacy International
https://privacyinternational.org/learn/who-they-target

Profiling the unemployed in Poland: social and political implications of algorithmic decision-making (2015)
https://panoptykon.org/sites/default/files/leadimage-biblioteka/panoptykon_profiling_report_final.pdf

The digital rights of LGBTQ+ people: When technology reinforces societal oppressions (17.07.2019)
https://edri.org/the-digital-rights-lgbtq-technology-reinforces-societal-oppressions/

10 Reasons Why Online Advertising is Broken (09.01.2020)
https://en.panoptykon.org/online-advertising-is-broken

More than the sum of our parts: a strategy for the EDRi Network (27.05.2020)
https://edri.org/more-than-the-sum-of-our-parts-a-strategy-for-the-edri-network/

COVID-Tech: Surveillance is a pre-existing condition (27.05.2020)
https://edri.org/surveillance-is-a-pre-existing-condition/