#ProtectNotSurveil: The EU AI Act fails migrants and people on the move

The #ProtectNotSurveil coalition is calling attention to how the EU AI Act - adopted by the European Parliament on March 13 - is failing to prevent harm and provide protection for migrants and people on the move.

By EDRi · March 13, 2024

On March 13, the EU Artificial Intelligence (AI) Act was adopted by the European Parliament. Whilst it is widely celebrated as a world-first, the EU AI Act falls short in the vital area of migration. The #ProtectNotSurveil coalition is calling attention to how this legislation is failing to prevent harm and provide protection for people on the move.

Why is the AI Act failing on migration?

While the AI Act may have positive aspects in other areas, it is weak on migration and even enables the use of risky AI systems in this context. Moreover, it sets a dangerous precedent by creating a parallel legal framework for the use of AI by law enforcement, migration and national security authorities, thus exempting such uses from the rules and safeguards within the AI Act.

The following are some of the notable areas where the AI Act falls short in preventing harm to migrants and people on the move:

  • Prohibitions on AI systems do not extend to the migration context.
  • The list of high-risk systems fails to capture the many AI systems used in the migration context.
  • AI used as part of EU large-scale databases in migration, such as Eurodac, the Schengen Information System, and ETIAS, will not have to comply with the Regulation until 2030.
  • The Act does not address how AI systems developed by EU-based companies impact people outside the EU, despite existing evidence of human rights violations in third countries facilitated by surveillance technologies developed in the EU.
  • The Act creates a parallel legal framework when AI is deployed by law enforcement, migration and national security authorities.

These loopholes and exceptions will harm migrants, racialised and other marginalised communities who already bear the brunt of discriminatory targeting and over-surveillance by authorities.

"The AI Act might be a new law but it fits into a much older story in which EU governments and agencies – including Frontex - have violated the rights of migrants and refugees for decades. Implemented along with a swathe of new restrictive asylum and migration laws, the AI Act will lead to the use of digital technologies in new and harmful ways to shore up ‘Fortress Europe’ and to limit the arrival of vulnerable people seeking safety. Civil society coalitions across and beyond Europe should work together to mitigate the worst effects of these laws, and continue to towards building societies that prioritise care over surveillance and criminalisation."

Chris Jones, Executive Director of Statewatch (member of EDRi and the #ProtectNotSurveil coalition)

What next?

The EU AI Act will take between two and five years to enter into force. In the meantime, harmful AI systems will continue to be tested, developed and deployed in many areas of public life. During this time, the #ProtectNotSurveil coalition is calling for the following crucial steps to be taken:

  • EU and national-level bodies to document and respond to harms stemming from the use of AI in migration and policing contexts, ensuring protection against the violation of people’s rights.
  • Civil society to contest further expansion of the surveillance framework, reversing and refusing trends that criminalise, target and discriminate against migrants, racialised and marginalised groups.
  • Everyone to re-evaluate the investment of resources in technologies that punish, target and harm people instead of affirming rights and providing protection.


The #ProtectNotSurveil coalition: Access Now, European Digital Rights (EDRi), Platform for International Cooperation on Undocumented Migrants (PICUM), Equinox Initiative for Racial Justice, Refugee Law Lab, AlgorithmWatch, Amnesty International, Border Violence Monitoring Network (BVMN), Digitalcourage, EuroMed Rights, European Center for Not-for-Profit Law (ECNL), European Network Against Racism (ENAR), Homo Digitalis, Privacy International, Statewatch, Dr Derya Ozkul, Dr Jan Tobias, and Dr Niovi Vavoula

The #ProtectNotSurveil coalition was launched in February 2023 to advocate for the AI Act to protect people on the move and racialised people from harms arising from the use of AI systems.