Initial wins in Italy just two months after the launch of Reclaim Your Face
Last week, the #ReclaimYourFace campaign reached two important milestones at the national level. On Friday 16 April, the Italian Data Protection Authority (DPA) rejected the SARI Real Time facial recognition system acquired by the police, saying that the system lacks a legal basis and that, as designed, it would implement a form of mass surveillance.
EU’s new artificial intelligence law risks enabling Orwellian surveillance states
When analysing how AI systems might impact people of colour, migrants and other marginalised groups, context matters. Whilst AI developers may be able to predict and prevent some negative biases, for the most part, such systems will inevitably exacerbate injustice. This is because AI systems are deployed in a wider context of systematic discrimination and violence, particularly in the field of policing and migration.
EU’s AI law needs major changes to prevent discrimination and mass surveillance
The European Commission has just launched its proposed regulation on artificial intelligence (AI). As governments and companies continue to use AI in ways that lead to discrimination and surveillance, the proposed law must go much further to protect people and their rights. Here’s a deeper analysis from the EDRi network, including some initial recommendations for change.
New AI law proposal calls out harms of biometric mass surveillance, but does not resolve them
On 21 April 2021, the European Commission put forward a proposal for a new law on artificial intelligence. With it, the Commission acknowledged some of the numerous threats biometric mass surveillance poses for our freedoms and dignity. However, despite its seemingly good intentions, the proposed law falls seriously short on our demands and does not in fact impose a ban on most cases of biometric mass surveillance – as urged by EDRi and the Reclaim Your Face coalition.
Why the EU needs to be wary that AI will increase racial profiling
Central to predictive policing systems is the notion that risk and crime can be objectively and accurately forecasted. Not only is this presumption flawed, it demonstrates a growing commitment to the idea that data can and should be used to quantify, track and predict human behaviour. The increased use of such systems is part of a growing ideology that social issues can be solved by allocating more power, resources - and now technologies - to police.
Regulating Border Tech Experiments in a Hostile World
We are facing a growing panopticon of technology that limits people’s movements, their ability to reunite with their families, and at the worst of times, their ability to stay alive. Power and knowledge monopolies are allowed to exist because there is no unified global regulatory regime governing the use of new technologies, creating laboratories for high-risk experiments with profound impacts on people’s lives.
Computers are binary, people are not: how AI systems undermine LGBTQ identity
Companies and governments are already using AI systems to make decisions that lead to discrimination. When police or government officials rely on them to determine who they should watch, interrogate, or arrest — or even “predict” who will violate the law in the future — there are serious and sometimes fatal consequences. EDRi's member Access Now explains how AI can automate LGBTQ oppression.
EU’s AI proposal must go further to prevent surveillance and discrimination
The European Commission has just launched the EU draft regulation on artificial intelligence (AI). AI systems are being increasingly used in all areas of life – to monitor us at protests, to identify us for access to health and public services, to make predictions about our behaviour or how much ‘risk’ we pose. Without clear safeguards, these systems could further the power imbalance between those who develop and use AI and those who are subject to them.
Civil society calls for stronger protections for fundamental rights in Artificial Intelligence law
In light of the recently leaked January 2021 draft of the Regulation on a European Approach for Artificial Intelligence, EDRi and 14 of our members signed an open letter to the President of the European Commission, Ursula von der Leyen, to underline the importance of ensuring the necessary protections for fundamental rights in the new regulation.
Artificial Intelligence and Fundamental Rights: Document Pool
This document pool gathers all EDRi analyses and documents related to Artificial Intelligence (AI) and fundamental rights.
Evidence shows a European future that is dystopian: #ReclaimYourFace now to protect your city
The latest evidence shows that biometric mass surveillance is rapidly being developed and deployed in Europe without a proper legal basis or respect for our agency as self-determined and autonomous individuals. No one is safe, as our most sensitive data like our faces, eyes, skin, palm veins, and fingerprints are being tracked, traced and analysed on social media, in the park, on the bus, or at work.
European Commission must ban biometric mass surveillance practices, say 56 civil society groups
On 1 April, a coalition of 56 human rights, digital rights and social justice organisations sent a letter to European Commissioner for Justice, Didier Reynders, ahead of the long-awaited proposal for new EU laws on artificial intelligence. The coalition is calling on the Commissioner to prohibit uses of biometrics that enable mass surveillance or other dangerous and harmful uses of AI.