EU’s AI law needs major changes to prevent discrimination and mass surveillance
The European Commission has just launched its proposed regulation on artificial intelligence (AI). As governments and companies continue to use AI in ways that lead to discrimination and surveillance, the proposed law must go much further to protect people and their rights. Here’s a deeper analysis from the EDRi network, including some initial recommendations for change.
-
New AI law proposal calls out harms of biometric mass surveillance, but does not resolve them
On 21 April 2021, the European Commission put forward a proposal for a new law on artificial intelligence. With it, the Commission acknowledged some of the numerous threats biometric mass surveillance poses to our freedoms and dignity. However, despite its seemingly good intentions, the proposed law falls seriously short of our demands and does not in fact ban most cases of biometric mass surveillance, as urged by EDRi and the Reclaim Your Face coalition.
Read more
-
Luca contact tracing app: CCC calls for an immediate moratorium
A dubious business model, defective software, irregularities in the awarding of contracts: EDRi member Chaos Computer Club (CCC) demands an immediate end to federal funding for the “Luca” contact tracing app.
Read more
-
Why the EU needs to be wary that AI will increase racial profiling
Central to predictive policing systems is the notion that risk and crime can be objectively and accurately forecast. Not only is this presumption flawed, it also demonstrates a growing commitment to the idea that data can and should be used to quantify, track and predict human behaviour. The increased use of such systems is part of a growing ideology that social issues can be solved by allocating more power, resources, and now technologies, to the police.
Read more
-
Upcoming judgment against mass surveillance in France
On Wednesday 21 April, the Conseil d'Etat (France's highest administrative court) will issue its final decision in the most important case that EDRi's observer La Quadrature du Net (LQDN) has ever brought against the intelligence services. This will be the end of six years of proceedings, dozens of briefs and countless twists and turns that have made LQDN what it is today.
Read more
-
Regulating Border Tech Experiments in a Hostile World
We are facing a growing panopticon of technology that limits people’s movements, their ability to reunite with their families, and at the worst of times, their ability to stay alive. Power and knowledge monopolies are allowed to exist because there is no unified global regulatory regime governing the use of new technologies, creating laboratories for high-risk experiments with profound impacts on people’s lives.
Read more
-
EU’s AI proposal must go further to prevent surveillance and discrimination
The European Commission has just launched the EU draft regulation on artificial intelligence (AI). AI systems are being increasingly used in all areas of life – to monitor us at protests, to identify us for access to health and public services, to make predictions about our behaviour or how much ‘risk’ we pose. Without clear safeguards, these systems could further the power imbalance between those who develop and use AI and those who are subject to them.
Read more
-
Artificial Intelligence and Fundamental Rights: Document Pool
Find in this document pool all EDRi analyses and documents related to Artificial Intelligence (AI) and fundamental rights.
Read more
-
Evidence shows a European future that is dystopian: #ReclaimYourFace now to protect your city
The latest evidence shows that biometric mass surveillance is rapidly being developed and deployed in Europe without a proper legal basis or respect for our agency as self-determined and autonomous individuals. No one is safe, as our most sensitive data, such as our faces, eyes, skin, palm veins and fingerprints, are being tracked, traced and analysed on social media, in the park, on the bus and at work.
Read more
-
Stop Spying on Asylum Seekers!
How would you feel if the government could cut off your access to your cash because your buying habits were deemed suspicious? That's the reality for many UK-based asylum seekers, who are spied on by the Home Office through their 'Aspen Card', the debit payment card they rely on for their basic subsistence and survival. Join our member Privacy International in their efforts to stop the government's harmful practice of spying on some of the most vulnerable members of our society.
Read more
-
No faces left to hack: #ReclaimYourFace Now!
We cannot let power-hungry and profit-orientated technologies manipulate our future, take away our dignity and treat us like walking, breathing barcodes. We have the right to exercise our autonomy and self-determination free from abusive practices that undermine our agency. The Reclaim Your Face European Citizens' Initiative (ECI) empowers Europeans to shape the public debate on the use of these AI-powered biometric technologies. The EU has the chance to show that people sit at the centre of its values by taking the lead to ban biometric mass surveillance that endangers our freedoms, democracies and futures.
Read more
-
Campaign against surveillance: Nobody will tell you when they will follow you
The rapid growth of new technologies has been of “benefit” to secret services. However, the law has lagged behind, proving unable to keep up with the new methods of surveillance used by secret services around the world. EDRi member Panoptykon Foundation has launched a campaign in Poland to highlight the problem of the unscrutinised powers of secret services.
Read more