Clearview AI fined £7.5m for illegally collecting UK citizens' data
Facial-recognition technology firm Clearview AI will be forced to pay a £7.5m fine and delete all data gathered from people in the UK after an investigation into its practices.
Controversial facial-recognition firm Clearview AI received a multimillion-pound fine for building a database of more than 20 billion images without informing people or gaining their consent.
The Information Commissioner’s Office (ICO) found the company did not have a lawful reason for collecting people’s information and failed to adequately inform UK residents over the use of their personal data. As a result, Clearview AI has been fined £7,552,800 and ordered to delete all data gathered from people in the UK.
Clearview AI first came under fire in 2020 after its database - built from scraping billions of publicly available images from social media - suffered a security breach. However, privacy advocates have long condemned the company's business model, based on allowing its clients to upload an image of a person to the company’s app and check it for a match against all photos in the database.
“The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service," said Information Commissioner John Edwards. "That is unacceptable."
The ICO has issued an enforcement notice ordering Clearview AI to stop obtaining and using the personal data of UK residents that is publicly available on the internet and to delete the data of UK residents from its systems. The decision was taken as a result of a joint investigation with the Office of the Australian Information Commissioner (OAIC), which focused on Clearview AI Inc’s use of people’s images, data scraping from the internet and the use of biometric data for facial recognition.
Although the company no longer has clients in the UK, the ICO concluded that Clearview AI's database is likely to include a substantial amount of data from UK residents, which the company would be offering to its clients in other countries. It also failed to have "a lawful reason for collecting people’s information," the report read.
“We have acted to protect people in the UK by both fining the company and issuing an enforcement notice," Edwards said.
“People expect that their personal information will be respected, regardless of where in the world their data is being used. That is why global companies need international enforcement."
In the past, Clearview AI founder Hoan Ton-That has maintained that the company has a right to public information and that its tool is invaluable in fighting crime. Its facial-recognition technology has recently been used to identify war casualties in Ukraine, as well as rioters who stormed the US Capitol building in 2021. However, the EU has recently called for a ban on police use of this technology, as it was found to breach privacy rules.
Other countries have also begun to take similar action to ensure the ethical use of facial-recognition technologies. Italy and France have both fined Clearview AI for its use of citizens' data. Moreover, Clearview AI recently signed a settlement with the American Civil Liberties Union (ACLU), in which it agreed to stop selling its technology to private actors in the US, ending a two-year data privacy lawsuit.
“International co-operation is essential to protect people’s privacy rights in 2022," said Edwards.
The collaboration between the ICO and the OAIC is a great first step in this direction. Later this week, Edwards will meet with EU regulators to continue working towards tackling threats to global privacy.
Currently, there is no specific legal framework in the UK regulating the use of facial-recognition technologies, although the government is expected to introduce new data protection legislation - deviating from the EU's General Data Protection Regulation (GDPR) - by the end of this year.