Controversial testing of facial recognition software in Germany
Since the beginning of August 2017, the German police have been testing facial recognition software at the Südkreuz train station in Berlin. The system is being tested on 300 volunteers. The goal is to evaluate how accurately the software recognises them and distinguishes them from the crowd – a capability that the police hope to ultimately use to track and arrest crime and terrorism suspects.
However, the test has been criticised on two grounds: the terms under which the experiment is being conducted, and the relevance of such a measure in the fight against terrorism.
In the aftermath of recent terrorist attacks, mass surveillance measures have been increasingly introduced in Europe as a means to “fight against terrorism”. These measures may give citizens the impression that the government is taking action, but there is no evidence that they are effective in achieving this goal.
By using facial recognition software, Thomas de Maizière, the German Minister of the Interior, aims to strengthen the public’s sense of security and to support the fight against terrorism. He considers that the measure does not undermine civil liberties, but lawyers and civil society organisations disagree, first and foremost with the terms of the experiment. The facial recognition software is being tested on volunteers who carry Bluetooth sensors transmitting information about their location. German EDRi member Digitalcourage reported that these sensors transmit data that is not needed for the results of the experiment, and that this was not communicated to the volunteers. Furthermore, Digitalcourage states that this data can easily be accessed by anyone.
Beyond the technical issues and the lack of informed consent, lawyers have denounced the experiment as unconstitutional and unnecessary, because it costs more in terms of civil rights than it can contribute to the fight against terrorism. The usefulness of mass surveillance in improving security is questionable, to say the least. The fact that those involved in recent terrorist attacks were known to the intelligence services and had previously been under surveillance did not prevent the attacks, and constantly following all potential suspects would require immense resources. It is difficult to see how introducing tools such as facial recognition in public places to widen the scope of surveillance, thus increasing the amount of data to be processed by law enforcement, could help prevent future terrorist attacks.
Facial recognition at Südkreuz station: Federal police gave incorrect information – We demand an end to the experiment (only in German)
https://digitalcourage.de/blog/2017/gesichtsscan-beenden
Berlin starts controversial test of facial recognition cameras at train station (02.08.2017)
https://www.thelocal.de/20170802/berlin-launches-controversial-test-of-facial-recognition-cameras-at-train-station
German police test facial recognition cameras at Berlin station (01.08.2017)
https://www.reuters.com/article/us-germany-security/german-police-test-facial-recognition-cameras-at-berlin-station-idUSKBN1AH4VR
Opinion: Facial recognition tech makes suspects of us all (31.08.2017)
http://gearsofbiz.com/opinion-facial-recognition-tech-makes-suspects-of-us-all/37827
Germany’s facial recognition pilot program divides public (24.08.2017)
http://www.dw.com/en/germanys-facial-recognition-pilot-program-divides-public/a-40228816
Facial recognition software to catch terrorists being tested at Berlin station (02.08.2017)
http://www.telegraph.co.uk/news/2017/08/02/facial-recognition-software-catch-terrorists-tested-berlin-station/
Facial recognition cameras at Berlin station are tricking volunteers, activists claim (23.08.2017)
https://www.thelocal.de/20170823/berlins-facial-recognition-cameras-criticized-for-collecting-more-data-than-necessary
(Contribution by Anne-Morgane Devriendt, EDRi intern)