On Saturday, Ukraine’s defence ministry reportedly began using Clearview AI’s facial recognition technology.
The US startup – which has been dogged by controversy – offered Ukraine its AI facial recognition software to uncover Russian assailants, combat misinformation and identify the dead.
Ukraine is reportedly receiving free access to Clearview AI’s powerful search engine for faces, letting authorities potentially vet people of interest at checkpoints, Lee Wolosky, an adviser to Clearview, told news agency Reuters.
After Russia’s invasion of Ukraine last month, Clearview’s co-founder and chief executive, Hoan Ton-That, sent a letter to Kyiv offering assistance.
Clearview clarified that it had not offered the technology to Russia.
While Ukraine’s Ministry of Defence did not reply to requests for comment, a spokesperson for Ukraine’s Ministry of Digital Transformation previously said it was considering offers from US-based artificial intelligence companies like Clearview.
The Clearview founder said his startup had more than 2 billion images from the Russian social media service VKontakte at its disposal, out of a database of over 10 billion photos in total.
According to Ton-That, Clearview’s database could help Ukraine identify the dead more easily than trying to match fingerprints and works even if there is facial damage.
His letter also said Clearview’s technology could be used to reunite refugees separated from their families, identify Russian operatives and help the government debunk false social media posts related to the war.
While it was unclear what exactly Ukraine’s defence ministry is using the technology for, Ton-That confirmed that other parts of Ukraine’s government are expected to deploy Clearview in the coming days.
The VKontakte images reportedly make Clearview’s dataset more comprehensive than that of PimEyes, a publicly available image search engine that people have used to identify individuals in war photos.
Critics warn that facial recognition could misidentify people at checkpoints and in battle.
‘A mismatch could lead to civilian deaths, just as unfair arrests have arisen from police use,’ Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project in New York, told Reuters.
Cahn described identifying the deceased as probably the least dangerous way to deploy the technology in war, but he said that ‘once you introduce these systems and the associated databases to a war zone, you have no control over how it will be used and misused’.
‘We’re going to see well-intentioned technology backfiring and harming the very people it’s supposed to help,’ he said.
Ton-That said Clearview should never be wielded as the sole source of identification and that he would not want the technology to be used in violation of the Geneva Conventions, which created legal standards for humanitarian treatment during war.
Like other users, those in Ukraine are reportedly receiving training and have to input a case number and reason for a search before queries.
Why is Clearview controversial?
Clearview, which primarily sells to US law enforcement, is fighting lawsuits in the US accusing it of violating privacy rights by taking images from the web.
In 2020, social media giant Facebook had demanded Clearview stop taking its data.
The company argues that its data gathering process is similar to how Google search works.
Still, several countries including the UK and Australia have deemed its practices illegal.
In 2020, the UK’s Metropolitan Police were revealed to be using the controversial facial recognition company’s services after a data breach exposed Clearview AI’s client list.
Many Western businesses have pledged to help Ukraine, providing internet hardware, cybersecurity tools and other support. SpaceX founder and billionaire Elon Musk even sent over a truck of Starlink terminals to provide satellite broadband service in Ukraine.
Metro.co.uk has reached out to VKontakte for comment.