Live facial-recognition open to ‘reckless’ use, ICO warns

The Information Commissioner has published a blog post expressing concern about the potential for misuse of live facial recognition (LFR) in public spaces and warning that data protection must be respected as its use expands.

LFR tools such as Amazon’s Rekognition allow law enforcement and other organisations to record the faces of passers-by in public spaces and automatically search for matches against a database of faces of people of interest, such as missing people or fugitive criminals.
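To illustrate the watchlist-matching pattern described above, the following is a minimal sketch using Amazon Rekognition's search_faces_by_image API via the boto3 library. The collection name, region, image file and similarity threshold are illustrative assumptions only, not details of any deployment discussed in this article.

    # Minimal sketch of the watchlist-matching pattern: search one camera
    # frame against a pre-built face collection. All identifiers below
    # (region, collection ID, file name, threshold) are assumptions.
    import boto3

    rekognition = boto3.client("rekognition", region_name="eu-west-2")

    # Load a single frame captured from a camera feed (hypothetical file).
    with open("frame.jpg", "rb") as f:
        frame_bytes = f.read()

    # Search an existing face collection (the "watchlist") for matches.
    response = rekognition.search_faces_by_image(
        CollectionId="watchlist",       # assumed, pre-populated collection
        Image={"Bytes": frame_bytes},
        FaceMatchThreshold=90,          # minimum similarity score (0-100)
        MaxFaces=5,
    )

    for match in response["FaceMatches"]:
        face = match["Face"]
        print(f"Match {face['FaceId']} at {match['Similarity']:.1f}% similarity")

In a live deployment this search would run continuously against every detected face in the video stream, which is precisely the mass, automatic collection of biometric data that the Commissioner's Opinion addresses.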

The UK Information Commissioner Elizabeth Denham has warned of the potential misuse of LFR in public places, commenting that the technology could be used “inappropriately, excessively, or even recklessly”.

In a blog post about her new Commissioner’s Opinion, she explained her concerns about using LFR to automatically collect biometric data in public areas. Investigations found numerous unjustified uses of LFR, including generating biometric profiles to target people with personalised advertising, and none of the uses investigated were fully compliant with data-protection law.

“I am deeply concerned about the potential for LFR to be used inappropriately, excessively, or even recklessly. When sensitive personal data is collected on a mass scale without people’s knowledge, choice, or control, the impacts could be significant,” Denham wrote. “We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take.”

She suggested that CCTV cameras could in future be combined with LFR and social media data: “LFR is supercharged CCTV.”

The Commissioner’s Opinion [PDF] – which is rooted in law and informed by her investigations – explains how data protection and privacy must be at the heart of decisions to deploy LFR. It says that organisations must demonstrate high standards of governance and accountability from the outset, including being able to justify that their use of LFR is “fair, necessary, and proportionate in each specific context in which it is deployed” and that less intrusive approaches are insufficient for that context. It also says that organisations must assess the risk of using this intrusive technology, including considering issues around accuracy and bias. It does not concern the use of LFR by law enforcement.

Speaking to the PA news agency, Denham added: “We’re at a crossroads right now, we in the UK and other countries around the world see the deployment of LFR and I think it’s still at an early enough stage that it’s not too late to put the genie back in the bottle.”

Last year, companies including IBM terminated or paused their work on LFR, while the EU proposed a five-year moratorium on its deployment in public spaces. This backlash was driven in part by outrage over racially motivated police brutality; commercial facial-recognition systems have repeatedly been shown to fail people with darker skin. For instance, a 2018 ACLU study found that Amazon’s Rekognition falsely matched 28 members of the US Congress against a database of mugshots, with Black and Latino legislators disproportionately affected.

In the UK, a civil rights campaigner took South Wales Police to court over its deployment of automated facial recognition (AFR) in public spaces around Cardiff. Court of Appeal judges ruled that the use of the technology in this context was unlawful. While this did not prevent the force from using LFR altogether, it means changes must be made to how it is deployed.
