
Data watchdog issues warning over police use of facial recognition
In a blog post, Information Commissioner Elizabeth Denham has warned that significant privacy issues remain in police trials of live facial-recognition (LFR) technology.
Facial-recognition technology has been deployed by law enforcement agencies around the world. In the UK, the technology has been trialled by London’s Metropolitan Police and South Wales Police. Police forces hope that the technology could allow them to identify persons of interest by matching the faces of people passing through public places with faces stored in a database.
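As a rough illustration of the matching step, the minimal Python sketch below compares a captured face embedding against a watch list using cosine similarity and a score threshold. It is a hypothetical example only: the 128-dimensional embeddings, the 0.6 threshold and the function names are assumptions for illustration, not details of any police system.

```python
# Hypothetical sketch of threshold-based watch-list matching.
# Not the system used by any police force; the embeddings, names
# and the 0.6 threshold are illustrative assumptions.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(face, watchlist, threshold=0.6):
    """Return the watch-list identity most similar to the captured
    face, or None if no similarity score clears the threshold."""
    best_id, best_score = None, threshold
    for identity, stored in watchlist.items():
        score = cosine_similarity(face, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id


# Example with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(5)}
captured = rng.normal(size=128)
print(match_against_watchlist(captured, watchlist))
```

The threshold is the key design trade-off: a lower value flags more passers-by and produces more false positives, while a higher one misses more genuine matches, which is the tension behind the trial figures reported below.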
These trials have been controversial, with deployment in London’s East End attracting protests from civil liberties group Liberty.
Last week, it was revealed that of the 42 ‘matches’ made during a public Met trial of the technology, just eight were correct, a success rate of under one in five. Human rights experts warned that the technology’s use would likely be judged unlawful if challenged in court. In May 2018, Denham said that she could take action against police forces if concerns about the technology were not adequately addressed.
In the blog post, Denham, whose office is investigating the use of LFR, detailed some of these issues. She acknowledged the aims of deploying facial-recognition technology, but advised police forces to do more to demonstrate their compliance with data protection law, including how ‘watch lists’ of persons of interest are compiled and which images are used.
“These trials […] represent the widespread processing of biometric data of thousands of people as they go about their daily lives and that is a potential threat to privacy that should concern us all,” she wrote. “I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR.”
“There remain significant privacy and data protection issues that must be addressed and I remain deeply concerned about the rollout of this technology,” she continued.
Denham also warned of the likelihood of certain inbuilt biases. Research has shown that commercial facial-recognition software performs less accurately for women and for people with darker skin. If these biases were replicated in police use of LFR, groups already disproportionately targeted by the police could face more false positive matches.
In May 2018, a court case began which could influence the future of LFR in the UK. Cardiff resident Ed Bridges, represented by Liberty, argued that South Wales Police violated his privacy and data protection rights, and breached human rights law, by processing images taken of him in public. South Wales Police argued that LFR does not violate Bridges’ rights, as it is used in the same way as photographing a person in public, and that the force does not retain the data of those not on its watch list.
Denham has said that the resulting judgement will form an important part of her office’s investigation into LFR. For the time being, she has advised that data protection impact assessments should be carried out before each deployment. Organisations deploying the technology should also produce appropriate policy documentation explaining exactly how and why it is being used and should ensure that the algorithms do not treat the race or sex of individuals unfairly.
Hannah Couchman, policy and campaigns officer at Liberty, said: “The Information Commissioner is right to highlight how invasive this technology is. Facial recognition is a discriminatory mass surveillance tool which is more likely to misidentify people of colour and women. It violates our privacy and forces us to self-censor – eroding our freedom to choose where we go and who we go with. It has no place on our streets.”