Facial recognition technology ‘fails’ at Notting Hill Carnival
Image credit: Reuters/Eddie Keogh
A trial of facial recognition technology by police monitoring the Notting Hill Carnival this year has led to incorrect matches and an erroneous arrest, a civil rights group claims.
The automated facial recognition system scans live CCTV footage for people who could match photographs of faces stored in the police database, which is thought to contain more than 20 million images. Police have argued that the technology is 95 per cent accurate.
In 2012, two British people went to the High Court arguing that the Metropolitan Police must delete their photos from the growing database of faces. The High Court ruled that the retention of the images was unlawful, although the police force has insisted that the storage of images on its national database complies with the Data Protection Act.
In February, following a Home Office review, the Home Secretary Amber Rudd ordered police to delete millions of images of innocent people from the database.
The Notting Hill Carnival, which was subject to more extensive policing than usual this year, was used to trial the facial recognition technology. A database of more than 500 images was compiled for the occasion: some of these people were to be arrested and others banned from attending.
The real-world trial of the as-yet-unregulated technology proved mostly a failure, however, producing approximately 35 incorrect matches, which led to one wrongful arrest and five police interventions.
The one person correctly identified by the technology had, by the time of the carnival, already been arrested for rioting and passed through the justice system, and so should not have been included in the database of suspects.
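The apparent contradiction between a claimed "95 per cent" accuracy and these results comes down to what is being measured: accuracy per comparison is not the same as the proportion of alerts that turn out to be correct. A minimal sketch of that second figure, taking the approximate match counts reported here at face value for the arithmetic:

```python
# Precision of the carnival trial's alerts, using the figures reported above
# (the "approximately 35" incorrect matches is treated as exact here purely
# for illustration).
false_matches = 35   # incorrect matches reported during the trial
true_matches = 1     # the single correct identification

total_alerts = false_matches + true_matches
precision = true_matches / total_alerts  # share of alerts that were correct

print(f"Total alerts: {total_alerts}")
print(f"Precision: {precision:.1%}")  # roughly 2.8% of alerts were correct
```

In other words, even if the system were highly accurate on any single comparison, fewer than three in every hundred alerts it raised at the carnival pointed at the right person.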
In a blog post by Silkie Carlo, technology policy officer for Liberty, the human rights pressure group, the trial of the technology was described as a “worryingly inaccurate and painfully crude facial recognition operation”.
Carlo, who observed the technology being trialled during the carnival, wrote that the system could not differentiate between men and women, and had not been tested for racial bias. According to Carlo, the project leader described the trial as a “resounding success”.
“It didn’t seem to register – or maybe matter – that the arrest was erroneous, that it had come at the price of the biometric surveillance of two million carnival-goers and considerable police resource, or that innocent people had been wrongly identified,” she wrote.