The Clearview/Ukraine partnership – How surveillance companies exploit war
Clearview announced it will offer its surveillance tech to Ukraine. It seems no human tragedy is off-limits to surveillance companies looking to sanitise their image.
In the midst of the atrocious war currently being waged by Russia on Ukraine, on 14 March 2022 Reuters reported that Clearview AI, the infamous online surveillance company, had offered its services to the Ukrainian Ministry of Defence. A day later, in an interview with TechCrunch, Ukraine’s Vice Prime Minister and Minister of Digital Transformation confirmed that the partnership with Clearview AI was “currently in very early development”.
Clearview is an online surveillance company that collects all photos it finds on the public Internet, runs them through its facial recognition algorithm and stores them in its searchable database. It then sells access to its database to various clients, most notably law enforcement authorities, who can search the database by uploading photos of subjects of interest and find matching faces and corresponding URLs.
This indiscriminate collection of photos and other personal information, without people’s knowledge or consent, threatens everyone’s rights and freedoms online and offline – and the use of Clearview’s database by authorities is a considerable expansion of the realm of surveillance, with very real potential for abuse.
This is unacceptable. In May 2021 we filed legal complaints against the company’s activities in five countries, which have already resulted in a number of decisions finding that Clearview violated data protection laws and threatened the exercise of fundamental human rights.
To date, the data protection authorities of Canada, Australia, the UK, France and, most recently, Italy have all issued findings of breach and ordered Clearview to delete all photos collected of people in their territories – with Italy even imposing the maximum €20 million GDPR fine. Some authorities have also found it unlawful for the police to use Clearview’s technology, notably Sweden, Canada, and Belgium.
PI is gravely concerned about Clearview’s partnership with Ukraine. Whatever the intention, people in Ukraine are currently at their most vulnerable – offering to deploy controversial technologies that exploit personal data seems irresponsible and on the verge of exploiting people’s distress and despair. As Clearview’s data collection has been found in violation of many countries’ privacy laws, this feels like trying to bandage a wound with an infected plaster.
The risks and perils of facial recognition and online surveillance have been extensively aired, and in a war context the potential consequences would be too atrocious to be tolerated – such as mistaking civilians for soldiers, or Ukrainians for Russian soldiers. And what if the Russian government used the opportunity to manipulate online pages and results to its own advantage? Clearview boasts about having 2 billion images from the Russian social media platform VKontakte in its database – in the context of Putin’s vast record of perpetrating online manipulation, this is highly worrying.
The use of this technology in a war context is unprecedented, and Clearview has given no assurances that it has thought through the risks. In times and places of peace, its technology is highly controversial and, as we’ve seen above, unlawful. Even the most careful safeguards we can establish in times of peace and stability vanish when faced with the lawlessness and unpredictability of war. The risks are simply too high. We call on Clearview to do the right thing for once, and withdraw its offer.
From the Covid pandemic to the war in Ukraine, it seems no human tragedy is off-limits to surveillance companies looking to sanitise their image.
One only has to look at the increasing crackdown on dissent in Russia to understand why mass surveillance technologies are such a threat to democracy and why they must be reined in. You can’t fight totalitarianism by adopting its infrastructure of control.
This article was first published here.
Image credits: Max Kukurudziak on Unsplash.
(Contribution by: EDRi member Privacy International)