Police deploy facial recognition in central London

Image: Oxford Circus on a sunny day (credit: Dreamstime)

The Metropolitan Police has deployed live facial-recognition cameras near Oxford Circus, at the heart of London’s busy shopping district.

Cameras mounted on dark blue police vans were spotted near Oxford Circus underground station, next to a sign warning the public that the Metropolitan Police is using live facial recognition. A photograph of the set-up was shared by civil liberties group Big Brother Watch on Twitter.

Silkie Carlo, director of Big Brother Watch, commented: “It’s alarming to see biometric mass surveillance being rolled out in London. Never before have citizens been subjected to identity checks without suspicion, let alone on a mass scale. We’re appalled that Sadiq Khan has approved such useless, dangerous, and authoritarian surveillance technology for London. This undemocratic expansion of the surveillance state must be reversed.”

The City of Westminster Police wrote on Twitter that it would be using facial recognition at some locations in Westminster today, saying: “This technology helps keep Londoners safe. We are using it to find people who are wanted for violent and other serious crimes.”

This appears to be the second time the Metropolitan Police has used the technology in public since it announced in January that it would deploy facial recognition to search for perpetrators of “serious and violent crime” (including knife crime), as well as for missing children and vulnerable adults. The system compares the faces of passers-by against a watchlist, with officers approaching anybody who generates a match.

The announcement followed a series of limited but controversial trials in different areas of London.

The decision to roll out facial-recognition cameras for law enforcement purposes has attracted criticism from privacy advocates and human rights experts, who argue that the technology both infringes civil liberties and is insufficiently accurate to serve as a reliable law enforcement tool. Independent research has demonstrated that commercial facial-recognition software tends to perform particularly poorly on women and darker-skinned people.

The Metropolitan Police has claimed that the technology has a very low failure rate, with just one false match per thousand. However, University of Essex researchers – using different metrics – found that across six trials the technology achieved only eight correct matches out of 42, representing a failure rate of more than 80 per cent. A facial-recognition system trialled by South Wales Police at the 2017 Champions League Final in Cardiff was reported to have a failure rate of 93 per cent, generating almost 2,300 false positives.
