
Amazon’s Rekognition shows its true colors

By EDRi · January 15, 2020

EDRi member Bits of Freedom has been investigating the problems associated with the use of facial recognition by the police in public spaces. As part of this investigation, they wanted to put the technology to the test themselves: how does facial recognition really work?

Digital tourism

On Dam Square, in the center of Amsterdam, you’ll find a camera. It’s no ordinary security camera: it broadcasts images of Dam Square in extremely high quality on YouTube, 24 hours a day, 7 days a week. The camera can zoom in from one side of the square straight to the other. According to the supplier’s website, this is good for “digital tourism”. The camera is also good for our investigation: it offers the perfect opportunity to test facial recognition technology. How bizarre is it that the thousands of people who cross Dam Square every day can, without their knowledge, also be seen on YouTube? And an even scarier vision of the future: what if they could all be registered using facial recognition technology?

Amazon’s Rekognition

“Rekognition”, Amazon’s facial recognition technology, is used by various police units in the United States and can be used directly over the internet by anyone with a credit card. Bits of Freedom investigated whether the program would recognize anyone visiting Dam Square.

We uploaded a picture as a test, allowing the software to become familiar with one specific face. The software then located various characteristics in the face, so-called “landmarks”, such as the lower tip of the chin, the nostrils, the pupils and the jawline.

But Rekognition generates even more data about our test face. It estimates her age, whether she is smiling, whether she is wearing (sun)glasses, what her gender is, and other facial features such as whether the individual has a beard or a moustache. In addition, Rekognition also registers “emotions”. According to the software, the individual is happy, but also a little anxious (that’s right: our test case does not love surveillance cameras).
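To give an impression of what this looks like in practice, the sketch below calls Amazon’s public Rekognition API through the boto3 Python SDK. It is a minimal, illustrative sketch only: the file name is hypothetical and valid AWS credentials are assumed to be configured.

```python
# Minimal sketch: detect a face and its attributes with Amazon Rekognition.
# "test-face.jpg" is a hypothetical file name; AWS credentials are assumed.
import boto3

client = boto3.client("rekognition", region_name="eu-west-1")

with open("test-face.jpg", "rb") as f:
    image_bytes = f.read()

# With Attributes=["ALL"], DetectFaces returns the facial "landmarks"
# (chin bottom, nostrils, pupils, jawline, ...) as well as estimated age
# range, gender, smile, (sun)glasses, beard, moustache and "emotions".
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

face = response["FaceDetails"][0]
for landmark in face["Landmarks"]:
    print(landmark["Type"], round(landmark["X"], 3), round(landmark["Y"], 3))

print("Age range:", face["AgeRange"])    # e.g. {'Low': 25, 'High': 35}
print("Gender:", face["Gender"])         # {'Value': ..., 'Confidence': ...}
print("Smile:", face["Smile"], "Sunglasses:", face["Sunglasses"])
print("Emotions:", sorted(face["Emotions"], key=lambda e: -e["Confidence"]))
```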

The first encounter is behind us. Now that Rekognition claims to know our test case, it’s time for our first test. We sent our individual to Dam Square and into the camera’s field of vision.

Bingo! Rekognition recognized her. With (rounded) 100 percent certainty it detected the face as a face, and it was 90 percent sure that this face matched the previously uploaded picture. But the software also saw that we had brought someone along, and it was convinced that this other person was not the same individual.
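The matching step can be illustrated with Rekognition’s CompareFaces operation. Again a minimal sketch: the file names are hypothetical stand-ins for the previously uploaded picture and a still grabbed from the public camera stream.

```python
# Minimal sketch: compare a reference picture against a camera still.
# File names are hypothetical; AWS credentials are assumed.
import boto3

client = boto3.client("rekognition", region_name="eu-west-1")

with open("test-face.jpg", "rb") as source, open("dam-square-frame.jpg", "rb") as target:
    response = client.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=80,  # only report matches above 80 percent similarity
    )

# Faces in the frame that match the reference picture, with a similarity score
for match in response["FaceMatches"]:
    print("Match:", round(match["Similarity"], 1), "percent similarity")

# Faces detected in the frame that did not match (e.g. the person we brought along)
print("Unmatched faces:", len(response["UnmatchedFaces"]))
```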

Very little is needed to recognize a face

A grainy picture is all Rekognition needs to recognize someone. Very little indeed! The picture we used was simply taken from the internet, and nowadays almost everyone has a picture of themselves online. And if you are a little handy with computers, you can “teach” Rekognition multiple faces. This means that, theoretically, the perfect stalker tool can be developed, especially if you link it to multiple cameras that broadcast their images on the internet (and those exist). It is impossible to exercise any control over this: you do not know what others are doing with this technology, or perhaps even with your pictures.
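“Teaching” Rekognition multiple faces works through so-called face collections, which are part of the same public API. The sketch below is purely illustrative; the collection ID and file names are hypothetical.

```python
# Minimal sketch: index several known faces into a collection, then search it.
# Collection ID and file names are hypothetical; AWS credentials are assumed.
import boto3

client = boto3.client("rekognition", region_name="eu-west-1")

# Create a collection and add ("index") several known faces to it
client.create_collection(CollectionId="demo-collection")
for path in ["face-1.jpg", "face-2.jpg"]:
    with open(path, "rb") as f:
        client.index_faces(
            CollectionId="demo-collection",
            Image={"Bytes": f.read()},
            ExternalImageId=path.replace(".jpg", ""),
        )

# Later, search the collection for the largest face found in a new image
with open("new-frame.jpg", "rb") as f:
    result = client.search_faces_by_image(
        CollectionId="demo-collection",
        Image={"Bytes": f.read()},
    )

for match in result["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], round(match["Similarity"], 1))
```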

Facial recognition technology is a mass surveillance tool

After two months of research into facial recognition technology, a feeling of disbelief dominates: why does this technology exist? Facial recognition can easily be used as a mass surveillance tool that makes it possible to continuously spy on and manipulate groups and individuals on a scale and with a speed that was previously impossible. Our insatiable hunger for greater efficiency and convenience means that we are losing sight of the fact that this technology violates the rights and freedoms of citizens. The use of facial recognition technology must be thoroughly debated and researched before it is fully normalised in our society, and people accept the inevitable corrosion of our fundamental rights as necessary for “progress”.

Bits of Freedom
https://www.bitsoffreedom.nl/

Amazon’s Rekognition shows its true colors (12.12.2019)
https://www.bitsoffreedom.nl/2019/12/12/amazons-rekognition-shows-its-true-colors/

(Contribution by Paula Hooyman, EDRi member Bits of Freedom, the Netherlands)