Hoan Ton-That, the CEO of the facial recognition business Clearview AI, began thinking about how he could help after Russia invaded Ukraine and images of the devastation inundated the news.
He believed his company’s technology could provide clarity in wartime.
“I recall watching videos of Russian soldiers being caught and Russia claiming they were actors,” Ton-That said. “I reasoned that by allowing Ukrainians to utilize Clearview, they would be able to obtain more information to validate their identities.”
He reached out to people who could help him contact the Ukrainian government in early March. Lee Wolosky, a lawyer who had worked for the Biden administration and a member of Clearview’s advisory board, was meeting with Ukrainian officials and offered to deliver a message.
Ton-That wrote a letter explaining that his app “can instantaneously identify someone only from a photo” and that it had been used by police and federal agencies in the United States to solve crimes. Clearview has been criticized for this function, which has raised privacy concerns as well as questions about racial and other biases in artificial intelligence systems.
Ton-That said the tool, which can identify a suspect caught on surveillance footage, could be valuable to a country under attack. By comparing faces against Clearview’s database of 20 billion images from the public web, including “Russian social sites such as VKontakte,” he said, the technology could identify people who might be spies, as well as the dead.
Ton-That decided to offer Clearview’s services to Ukraine for free, as reported earlier by Reuters. Now, less than a month later, the New York-based Clearview has created more than 200 accounts for users at five Ukrainian government agencies, which have conducted more than 5,000 searches. Clearview has also translated its app into Ukrainian.
“It’s been an honor to help Ukraine,” said Ton-That, who provided emails from officials at three Ukrainian agencies confirming that they had used the tool. It has identified dead soldiers and prisoners of war, as well as travelers in the country, confirming the names on their official IDs; fear of spies and saboteurs has led to heightened paranoia.
According to one email, Ukraine’s national police received two photos of dead Russian soldiers, which have been viewed by The New York Times, on March 21. One man had identifying patches on his uniform, but the other did not, so the police ran his face through Clearview’s app.
The app surfaced photos of a similar-looking man, a 33-year-old from Ulyanovsk who wore a paratrooper uniform and held a gun in his profile photos on Odnoklassniki, a Russian social media site. According to an official from the national police, attempts were made to contact the man’s relatives in Russia to inform them of his death, but there was no response.
Identifying dead soldiers and notifying their families is part of a campaign, Ukrainian Vice Prime Minister Mykhailo Fedorov wrote in a Telegram post, to convey to the Russian public the cost of the conflict and to “dispel the myth of a ‘special operation’ in which there are ‘no conscripts’ and ‘no one dies.’”
Images from conflict zones of slaughtered civilians and soldiers left behind on city streets turned battlefields have become more widely and instantaneously available in the social media era. President Volodymyr Zelenskyy of Ukraine has shown graphic images of attacks on his country to world leaders in making his case for more international aid. But beyond conveying a visceral sense of war, those kinds of images can now offer something else: a chance for facial recognition technology to play a significant role.
Critics warn, however, that the tech companies could be taking advantage of a crisis to expand with little privacy oversight and that any mistakes made by the software or those using it could have dire consequences in a war zone.
Evan Greer, a deputy director for the digital rights group Fight for the Future, is opposed to any use of facial recognition technology and said she believed that it should be banned worldwide because governments had used it to persecute minority groups and suppress dissent. Russia and China, among others, have deployed advanced facial recognition technology in cameras in cities.
“War zones are often used as testing grounds — not just for weapons, but surveillance tools that are later deployed on civilian populations or used for law enforcement or crowd control purposes,” Greer said. “Companies like Clearview are eager to exploit the humanitarian crisis in Ukraine to normalize the use of their harmful and invasive software.”