Data watchdog to investigate facial-recognition tech around King’s Cross
The Information Commissioner’s Office (ICO) has announced that it is opening an investigation into the deployment of facial-recognition software around the King’s Cross Central site in North London.
The 67-acre site contains 50 buildings, including the King’s Cross and St Pancras International railway stations, plus a number of shops and restaurants. The site is owned and managed by a group comprising property developer Argent, Hermes Investment Management and pension scheme AustralianSuper.
The group has deployed an unknown number of cameras across the site. These capture footage of people passing through the area, which is then analysed with facial-recognition software. Argent previously told the Financial Times: “These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public.”
Earlier this week, Mayor of London Sadiq Khan stated that he had contacted the chief executive of the King’s Cross Central site to explain his concerns about the deployment of facial recognition in this public space.
The technology has been deployed by security and law enforcement bodies across the world, although it remains highly controversial on account of its constant and unavoidable surveillance of ordinary people. Researchers have also demonstrated that commercial facial-recognition software tends to perform poorly when analysing the faces of women and people with darker skin.
This week, the ACLU demonstrated that commercial facial-recognition software is prone to making incorrect matches by running images of California lawmakers through Amazon’s Rekognition software. The group found that the software falsely matched more than one in five of the lawmakers against a database of arrest photos, with ethnic minorities disproportionately affected.
In July, the House of Commons Science and Technology Committee advised that public authorities suspend trials of facial-recognition technology until a legal framework is established and raised further concerns about the ongoing storage of police custody images of individuals not convicted of any crime.
The ICO has confirmed that it will be investigating the use of facial recognition at the King’s Cross site in London.
“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” said Elizabeth Denham, the Information Commissioner. “That is especially the case if it is done without people’s knowledge or understanding.
“I remain deeply concerned about the growing use of facial-recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used.”
Denham said that the investigation would collect detailed information from the site managers about how the technology is used, as well as inspecting the system and its day-to-day operations, in order to assess whether the deployment complies with data protection law and is “legal, proportionate and justified”.
Meanwhile, civil liberties group Big Brother Watch has concluded an investigation which, it says, demonstrates an “epidemic” of facial-recognition technology use on privately owned sites such as Meadowhall shopping centre in Sheffield and the World Museum in Liverpool.
“Facial-recognition surveillance risks making privacy in Britain extinct,” commented Silkie Carlo, director of Big Brother Watch.