Retrospective facial recognition surveillance conceals human rights abuses in plain sight
Following the burglary of a French logistics company in 2019, facial recognition technology (FRT) was run on security camera footage of the incident in an attempt to identify the perpetrators. The FRT system returned a list of two hundred potential suspects. From this list, the police singled out ‘Mr H’ and charged him with the burglary, despite the absence of any physical evidence connecting him to the crime. Relying on this notoriously discriminatory technology, the judge sentenced Mr H to 18 months in prison.