Digital Dignity Document Pool
Digital technologies can have a profound effect on our societies, but sufficient attention is rarely given to how certain applications differentiate between, target and experiment on communities at the margins. This document pool gathers resources for those who are interested in learning about and contesting the harms to dignity and equality that arise from uses of technology and data.
How can we live in dignity when technological ‘progress’ puts us at risk? What does it mean to strive for equality when technologies benefit some but marginalise others? How do we get justice when technology is used as a cloak for discrimination?
These questions and more underpin our work on “Digital Dignity”. Whilst dignity is often thought of as one of the most difficult rights to understand and apply, it can give us a useful lens to interrogate the problems associated with certain applications of AI and other digital technologies.
In this document pool we will be listing articles, documents and resources which relate to dignity in the digital environment:
“Human dignity is inviolable. It must be respected and protected.” – Article 1, Charter of Fundamental Rights of the European Union
Dignity is an important component of protecting our autonomy, our freedom, our self-respect and self-determination, others’ respect for us and, of course, our rights to live equally, without being discriminated against, and with equitable access to justice. Developments in digital technology increasingly impact human dignity. Data about us can have a profound impact on the ways in which we live our lives and on whether we feel that our bodies, and even our very existence, are treated with respect.
Blogs from the Digital Dignity coalition
Read the blogs from EDRi and other members of the coalition about the formation and growth of our group, how we are working to collectively centre the communities and voices that are most impacted by developments in technology, and the different angles and topics that make up our work. We'll be publishing more content to explore digital dignity from different perspectives as we go.
-
Building a coalition for Digital Dignity
In 2020 EDRi started to build the ‘Digital Dignity Coalition’, a group of organisations and activists active at the EU level dedicated to upholding rights in digital spaces and resisting harmful uses of technology. We’ve been organising to understand and resist how technological practices differentiate, target and experiment on communities at the margins - this article sets out what we’ve done so far.
Read more
-
Romani rights and biometric mass surveillance
The rights of Romani people should be an important topic for anyone who cares about digital rights. In this blog, hear from experts in Roma, Sinti and digital rights about why facial recognition is an important issue (and what the rest of the digital rights community can learn), and check out the Reclaim Your Face campaign’s first ever resource in the Sinti language!
Read more
-
Digital Dignity Workshops to explore intersection of rights, justice & AI / biometrics
It is clear that both the benefits and the harms of emerging technologies are not distributed equally, and experiences of technology are frequently discriminatory. The start of this collaboration will consist of two online ‘Digital Dignity’ workshops, to be held on Monday 7 December, 13:00 – 15:30 CET and Thursday 10 December, 13:00 – 15:30 CET.
Read more
EDRi's work on the intersection of dignity and discrimination
Dignity is implicated in many uses of technology. Check out our broader work on issues of dignity when it comes to social media platforms, debiaising data, decolonising the digital rights field and more.
-
If AI is the problem, is debiasing the solution?
The development and deployment of artificial intelligence (AI) in all areas of public life have raised many concerns about the harmful consequences on society, in particular the impact on marginalised communities. EDRi's latest report "Beyond Debiasing: Regulating AI and its Inequalities", authored by Agathe Balayn and Dr. Seda Gürses, argues that policymakers must tackle the root causes of the power imbalances caused by the pervasive use of AI systems. In promoting technical ‘debiasing’ as the main solution to AI-driven structural inequality, we risk vastly underestimating the scale of the social, economic and political problems AI systems can inflict.
Read more
-
Computers are binary, people are not: how AI systems undermine LGBTQ identity
Companies and governments are already using AI systems to make decisions that lead to discrimination. When police or government officials rely on them to determine who they should watch, interrogate, or arrest — or even “predict” who will violate the law in the future — there are serious and sometimes fatal consequences. EDRi member Access Now explains how AI can automate LGBTQ oppression.
Read more
-
Creating Conditions for a Decolonised Digital Rights Field
Since 2019, DFF and EDRi have been working to initiate a decolonising process for the digital rights field. Reflecting on the increased challenges to our digital rights, we realised how imperative it is that the field truly reflects everyone in European society. This means improving representation in the digital rights field, but more crucially undoing the power structures preventing us from protecting digital rights for everybody.
Read more
-
Envisioning a Decolonised Digital Rights Field – and Charting Next Steps
This week, a group of 30 participants working on issues of racial, social and economic justice, digital rights, and philanthropy came together not only to collectively imagine just that, but also to identify the building blocks for a process that might help us get there.
Read more
-
Facial recognition and fundamental rights 101
This is the first post in a series about the fundamental rights impacts of facial recognition. Private companies and governments worldwide are already experimenting with facial recognition technology. Individuals, lawmakers, developers - and everyone in between - should be aware of the rise of facial recognition, and the risks it poses to rights to privacy, freedom, democracy and non-discrimination.
Read more
-
The digital rights of LGBTQ+ people: When technology reinforces societal oppressions
Online surveillance and censorship impact everyone’s rights, and particularly those of already marginalised groups such as lesbian, gay, bisexual, transgender, queer and other (LGBTQ+) people. The use of new technologies usually reinforces existing societal biases, making those communities particularly prone to discrimination and security threats.
Read more
- EDRi, ‘Artificial Intelligence & Fundamental Rights: How AI impacts marginalized groups, justice and equality’ (June 2020)
- UN Special Rapporteur, ‘Racial discrimination and emerging digital technologies: a human rights analysis: Report of the Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance’ (June 2020)
- EDRi, ‘Digital rights for all’ (July 2020)
- European Disability Forum (EDF), ‘Artificial Intelligence’ (January 2019)
- PICUM, ‘Data Collection and Digital Technologies’ [resource page]
- European Sex Workers Rights Alliance (ESWA), ‘ICRSE statement regarding OnlyFans’ explicit content ban and the exclusion of sex workers from digital platforms’ (September 2021)
- European Network Against Racism (ENAR), ‘Data-driven profiling: the hardwiring of discriminatory policing practices across Europe’ (November 2019)
Reports & Evidence
Explore reports from EDRi on the many ways in which uses of technology can target and threaten the dignity of people on the move and at borders, LGBTQI+ people, racialised people, those visiting a place of worship and human rights defenders.
-
New EDRi report reveals depths of biometric mass surveillance in Germany, the Netherlands and Poland
In a new research report, EDRi reveals the shocking extent of unlawful biometric mass surveillance practices in Germany, the Netherlands and Poland, which are taking over our public spaces such as train stations, streets and shops. The EU and its Member States must act now to set clear legal limits to these practices, which create a state of permanent monitoring, profiling and tracking of people.
Read more
-
Technological Testing Grounds: Border tech is experimenting with people’s lives
The European Union is increasingly experimenting with high-risk migration management technologies.
Read more
Open letters
See the letters that EDRi has coordinated along with members of the digital dignity coalition and other civil society groups to call for respect for dignity and equality in EU laws and policies.
-
Civil society calls for stronger protections for fundamental rights in Artificial Intelligence law
In light of the recently leaked draft of the Regulation on A European Approach For Artificial Intelligence from January 2021, EDRi and 14 of our members signed an open letter to the President of the European Commission, Ursula von der Leyen, to underline the importance of ensuring the necessary protections for fundamental rights in the new regulation.
Read more
-
European Commission must ban biometric mass surveillance practices, say 56 civil society groups
On 1 April, a coalition of 56 human rights, digital rights and social justice organisations sent a letter to European Commissioner for Justice, Didier Reynders, ahead of the long-awaited proposal for new EU laws on artificial intelligence. The coalition is calling on the Commissioner to prohibit uses of biometrics that enable mass surveillance or other dangerous and harmful uses of AI.
Read more
-
Civil society calls for AI red lines in the European Union’s Artificial Intelligence proposal
European Digital Rights together with 61 civil society organisations have sent an open letter to the European Commission demanding red lines for the applications of AI that threaten fundamental rights.
Read more
- Want to be a part of the Digital Dignity coalition? Please send a message to our co-ordinators: Ella Jakubowska, Sarah Chander and Fenya Fischer
- Please note that the coalition is specifically intended to be a constructive and supportive space for activists, NGO staff or volunteers, advocates and campaigners working across human rights and social justice from a variety of anti-discrimination and/or inclusion perspectives. You don’t need to know anything about data, artificial intelligence or digital rights to join us.
- Want to get involved in the related campaign against discriminatory biometric mass surveillance? Join the Reclaim Your Face campaign