Facial recognition legal challenge defeated in the High Court
A legal challenge launched against the use of automated facial recognition (AFR) technology has been rejected by the High Court.
Ed Bridges, who brought the challenge with the support of civil rights group Liberty, has said he will continue the fight and appeal against the decision.
Bridges took up the case because he believed his face was scanned by South Wales Police while he was doing his Christmas shopping in 2017 and again at a peaceful anti-arms protest.
His lawyers argued the use of the technology caused him “distress” and violated his privacy and data protection rights by processing an image taken of him in public.
But his case was dismissed on Wednesday by two senior judges, who ruled that the use of the technology was not unlawful.
“South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent,” Bridges said.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
South Wales Police chief constable Matt Jukes said: “I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern.
“So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme.
“With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach.
“There is, and should be, a political and public debate about wider questions of privacy and security.
“It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.
“So, this decision is welcome but, of course, not the end of the wider debate.
“I hope policing will be supported by that continuing in an informed way with bodies such as Liberty who brought this action and government each playing their valuable role.”
Lord Justice Haddon-Cave and Mr Justice Swift said they were told by lawyers during a three-day hearing that Bridges’ case was the first time any court in the world had considered the use of AFR.
The judges concluded that they were “satisfied” the current legal regime is adequate to “ensure appropriate and non-arbitrary use of AFR” and that the force’s use to date of the technology has been “consistent” with human rights and data protection laws.
In his opening remarks, Lord Justice Haddon-Cave said: “The algorithms of the law must keep pace with new and emerging technologies.
“The central issue is whether the current legal regime in the United Kingdom is adequate to ensure the appropriate and non-arbitrary use of AFR in a free and civilised society.
“At the heart of this case lies a dispute about the privacy and data protection implications of AFR.”
The decision was relayed over video link from the High Court in London to the High Court in Cardiff, where the case was heard in May.
Silkie Carlo, director of Big Brother Watch (BBW), said in a press statement: “People in Wales have been let down and are pleased that Mr Bridges intends to appeal this profoundly disappointing judgment, which failed to grasp the intrusive nature of this technology.”
Carlo also said that if South Wales Police decide to use live facial recognition surveillance again, “we will take them to court”.
“The independent review into their use of the surveillance was utterly damning and found it was both staggeringly inaccurate and highly likely to be found unlawful,” she said.
Cressida Dick, the current commissioner of the Metropolitan Police Service (MPS), warned of the risk of new surveillance technologies being used to create an omniscient police state. BBW commented that this would be ironic, as live facial recognition is arguably the most oppressive surveillance technology the Met uses in public. “We urge her to focus on more effective and democratic policing methods and drop live facial recognition,” Carlo said.
BBW also argues that “there still has not been a single debate in the House of Commons on live facial recognition nor a single British law that contains the words facial recognition, yet we now have an epidemic of the surveillance across the country”.
The civil liberties group also said that live facial recognition is incompatible with democracy, and vowed to “fight [it] until it is banned”.