Artificial intelligence (AI)
Artificial intelligence (AI) refers to a broad range of processes and technologies that enable computers to complement or take over tasks otherwise performed by humans. Such systems can exacerbate surveillance and intrusion into our personal lives, reflect and reinforce some of the deepest societal inequalities, fundamentally alter the delivery of public and essential services, undermine vital data protection legislation, and disrupt the democratic process itself. In the face of this, EDRi strives to uphold our fundamental rights, democracy, equality and justice in all legislation, policy and practice related to artificial intelligence.
-
Human rights focus missing in the State of the Union 2021 address
The State of the Union 2021 address took place on 15 September. In this yearly address, the European Commission takes stock of the preceding year and the Commission President announces key legislation and responses to crucial international events.
-
Digital Dignity Document Pool
Digital technologies can have a profound effect on our societies, but sufficient attention is rarely given to how certain applications differentiate between, target and experiment on communities at the margins. This document pool gathers resources for those that are interested in learning about and contesting the harms to dignity and equality that arise from uses of technology and data.
-
If AI is the problem, is debiasing the solution?
The development and deployment of artificial intelligence (AI) in all areas of public life have raised many concerns about the harmful consequences for society, in particular the impact on marginalised communities. EDRi's latest report "Beyond Debiasing: Regulating AI and its Inequalities", authored by Agathe Balayn and Dr. Seda Gürses, argues that policymakers must tackle the root causes of the power imbalances caused by the pervasive use of AI systems. In promoting technical ‘debiasing’ as the main solution to AI-driven structural inequality, we risk vastly underestimating the scale of the social, economic and political problems AI systems can inflict.
-
EDRi submits response to the European Commission AI adoption consultation
On 3 August 2021, European Digital Rights (EDRi) submitted its response to the European Commission’s adoption consultation on the Artificial Intelligence Act (AIA).
-
No place for emotion recognition technologies in Italian museums
An Italian museum is trialling emotion recognition systems, despite the practice being heavily criticised by data protection authorities, scholars and civil society. The ShareArt system collects, among other data, the age, gender and emotions of visitors. EDRi member Hermes Center has called on the data protection authority (DPA) to open an investigation.
-
They can hear you: 6 ways tech is listening to you
Voice recognition technology often violates human rights, and it’s popping up more and more. Recently, EDRi member Access Now called out Spotify for developing voice recognition tech that claims to be able to detect gender and emotional state, among other things. But it’s not just Spotify. Some of the most powerful companies in the world are deploying similarly abusive tech because harvesting data about you is profitable. The market for voice recognition is growing and is expected to be worth a whopping $26.8 billion by 2025.
-
EU privacy regulators and Parliament demand AI and biometrics red lines
In their Joint Opinion on the AI Act, the EDPS and EDPB “call for [a] ban on [the] use of AI for automated recognition of human features in publicly accessible spaces, and some other uses of AI that can lead to unfair discrimination”. Taking the strongest stance yet, the Joint Opinion explains that “intrusive forms of AI – especially those who may affect human dignity – are to be seen as prohibited” on fundamental rights grounds.
-
New EDRi report reveals depths of biometric mass surveillance in Germany, the Netherlands and Poland
In a new research report, EDRi reveals the shocking extent of unlawful biometric mass surveillance practices in Germany, the Netherlands and Poland, which are taking over our public spaces such as train stations, streets and shops. The EU and its Member States must act now to set clear legal limits on these practices, which create a state of permanent monitoring, profiling and tracking of people.
-
The #PaperBagSociety challenge
The #PaperBagSociety is a social media challenge, part of the #ReclaimYourFace campaign, that invites everyone to share online the impact of living life with a paper bag on their head. With it, we aim to raise awareness of how absurd it is to have to avoid facial recognition technologies in public spaces, and why we need to build an alternative future, free from biometric mass surveillance.
-
Workplace, public space: workers organising in the age of facial recognition
‘Surveillance capitalism’ is increasingly threatening workers’ collective action and the human right to public protest.
-
EDRi joins 178 organisations in global call to ban biometric surveillance
From protesters taking to the streets in Slovenia, to the subways of São Paulo; from so-called “smart cities” in India, to children entering French high schools; from EU border control experiments, to the racialised over-policing of people of colour in the US. In each of these examples, people around the world are increasingly and pervasively being subjected to toxic biometric surveillance. This is why EDRi has joined the global Ban Biometric Surveillance coalition, to build on our work in Europe as part of the powerful Reclaim Your Face campaign.
-
The urgent need to #reclaimyourface
The rise of automated video surveillance is often touted as a quick, easy and efficient solution to complex societal problems. In reality, roll-outs of facial recognition and other biometric mass surveillance tools constitute a systematic invasion of people’s fundamental rights to privacy and data protection. Just as with toxic chemicals, these toxic uses of biometric surveillance technologies need to be banned across Europe.