EDRi and 41 human rights organisations call on the European Parliament to reject amendments to AI and criminal law report
EDRi and 41 human rights organisations* call on the members of the European Parliament to vote against the new amendments, which enable discriminatory predictive policing and biometric mass surveillance.
-
Building a coalition for Digital Dignity
In 2020 EDRi started to build the ‘Digital Dignity Coalition’, a group of organisations and activists active at the EU level dedicated to upholding rights in digital spaces and resisting harmful uses of technology. We’ve been organising to understand and resist how technological practices differentiate, target and experiment on communities at the margins. This article sets out what we’ve done so far.
-
Digital Dignity Document Pool
Digital technologies can have a profound effect on our societies, but sufficient attention is rarely given to how certain applications differentiate between, target and experiment on communities at the margins. This document pool gathers resources for those who are interested in learning about and contesting the harms to dignity and equality that arise from uses of technology and data.
-
If AI is the problem, is debiasing the solution?
The development and deployment of artificial intelligence (AI) in all areas of public life have raised many concerns about the harmful consequences for society, in particular the impact on marginalised communities. EDRi's latest report "Beyond Debiasing: Regulating AI and its Inequalities", authored by Agathe Balayn and Dr. Seda Gürses,* argues that policymakers must tackle the root causes of the power imbalances caused by the pervasive use of AI systems. By promoting technical ‘debiasing’ as the main solution to AI-driven structural inequality, we risk vastly underestimating the scale of the social, economic and political problems AI systems can inflict.
-
Who, What, Why? Your guide to all things ECI! #ReclaimYourFace
Calling all digital rights heroes: EDRi needs your support! As part of the Reclaim Your Face campaign, we are running a European Citizens’ Initiative (ECI) to ban biometric mass surveillance practices in the EU. To be successful, we need to collect 1 million signatures. Read more to find out how we keep ECI data safe, and how your signature can make a big difference.
-
Roma & Sinti rights, Resistance & Facial Recognition: RYF in Conversation…
For communities that have been historically sidelined, the promises of digitalisation can instead become a vessel for yet more discrimination and unequal treatment. Facial recognition in particular has a dark history linked to the persecution of Romani communities. If you missed our webinar on Roma and Sinti rights and the rise of facial recognition across Europe, you can catch up here and learn what the digital rights community can and should do!
-
Romani rights and biometric mass surveillance
The rights of Romani people should be an important topic for anyone who cares about digital rights. In this blog, hear from experts in Roma, Sinti and digital rights about why facial recognition is an important issue (and what the rest of the digital rights community can learn), and check out the Reclaim Your Face campaign’s first ever resource in the Sinti language!
-
EDRi submits response to the European Commission AI adoption consultation
Today, 3 August 2021, European Digital Rights (EDRi) submitted its response to the European Commission’s consultation on the adoption of the Artificial Intelligence Act (AIA).
-
No place for emotion recognition technologies in Italian museums
An Italian museum is trialling emotion recognition systems, despite the practice being heavily criticised by data protection authorities, scholars and civil society. The ShareArt system collects, among other data, the age, gender and emotions of visitors. EDRi member Hermes Center has called on the data protection authority to investigate.
-
They can hear you: 6 ways tech is listening to you
Voice recognition technology often violates human rights, and it’s popping up more and more. Recently, EDRi member Access Now called out Spotify for developing voice recognition tech that claims to be able to detect gender and emotional state, among other things. But it’s not just Spotify. Some of the most powerful companies in the world are deploying similarly abusive tech because harvesting data about you is profitable. The market for voice recognition is growing, and is expected to be worth a whopping $26.8 billion by 2025.
-
EU privacy regulators and Parliament demand AI and biometrics red lines
In their Joint Opinion on the AI Act, the EDPS and EDPB “call for [a] ban on [the] use of AI for automated recognition of human features in publicly accessible spaces, and some other uses of AI that can lead to unfair discrimination”. Taking the strongest stance yet, the Joint Opinion explains that “intrusive forms of AI – especially those who may affect human dignity – are to be seen as prohibited” on fundamental rights grounds.
-
Biometric mass surveillance flourishes in Germany and the Netherlands
In a new research report, EDRi reveals the shocking extent of biometric mass surveillance practices in Germany, the Netherlands and Poland, which are taking over public spaces such as train stations, streets and shops. The EU and its Member States must act now to set clear legal limits to these practices, which create a state of permanent monitoring, profiling and tracking of people.