Automating Injustice

Join Fair Trials on 9 September to discuss their new report, Automating Injustice, on the use of these systems and their harmful impact. Speakers with first-hand experience of the harm these systems cause, experts on AI and automated systems in criminal justice, and leading European policymakers will be joining the event.

Save the date:

  • 9 September
  • 17:00 CEST
  • Information & registration: https://www.fairtrials.org/automating-injustice

Artificial intelligence (AI) and automated decision-making (ADM) systems are increasingly used by European law enforcement and criminal justice authorities to predict and profile people’s actions and assess their ‘risk’ of criminality or re-offending in the future.

These AI and ADM systems can reproduce and reinforce discrimination, exacerbating inequality on grounds including race, socio-economic status and nationality. They also engage and infringe fundamental rights, including the right to a fair trial and the presumption of innocence.

Chair: Laure Baudrihaye-Gérard, Legal Director (Europe), Fair Trials

Panellists:

Diana Sardjoe, Founder, De Moeder is de Sleutel (The Mother is the Key), mother of children impacted by risk modelling and profiling systems

Petar Vitanov MEP (S&D, Bulgaria), Rapporteur of the LIBE Committee's report on AI in criminal matters

Sarah Chander, Senior Policy Advisor, European Digital Rights (EDRi)

James MacGuill, Solicitor and Vice-President of the Council of Bars and Law Societies of Europe (CCBE)

Griff Ferris, Legal & Policy Officer, Fair Trials (presenting the Automating Injustice report)