Facial recognition may never be appropriate for police use, says ethics board
Axon – the company best known for the Taser incapacitation device – has announced that it will not be adding facial recognition technology to its police body cameras, based on the recommendations of its AI and ethics board.
Axon has shifted its business away from the Taser and other weaponry in recent years to expand into other areas of policing technology, including immersive training, cloud platforms for policing and body cameras.
In April 2018, the company established an AI and Policing Technology Ethics Board of 11 experts to advise the company on how to develop new AI products and services.
The board has now published its first report, making several recommendations. Most notably, it concludes that “face-recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras” and advises strongly against deploying it now, and potentially ever.
It warns that before any such technology could be introduced, it would need to perform equally well across different ethnicities, genders and other identity groups. Commercial facial-recognition systems have been shown to misidentify women and people with darker skin tones at notably higher rates.
The board also recommended that, should facial recognition ever be included in a future Axon product, police officers should not be able to customise the technology themselves, in order to prevent abuse. It further advised that the technology should only be introduced with the consent and input of those affected, and that more weight should be given to reducing false positives and false negatives than to the blunt headline metric of ‘accuracy’.
So far, Axon has not done work to match faces in photographs and videos with faces stored in a police database. Its work in facial recognition has been limited to detecting, tracking and re-identifying faces in video footage in order to prepare the footage (by blurring faces of passers-by) before releasing it to the public.
In response to the board’s recommendation (as well as “technological limitations” to integrating this technology into body cameras), the company has announced that it “will not be commercialising face-matching products on [its] body camera”. However, it will continue to research and refine facial-recognition software, including efforts to reduce bias in the software.
The use of facial-recognition technology by law enforcement has proved controversial, with trials in London attracting criticism from civil liberties and privacy groups. In May, San Francisco city officials voted 8-1 to ban government agencies from using facial-recognition technology, making the tech-friendly hub the first city in the US to introduce a ban on the technology.
In the same month, Amazon shareholders voted against a motion calling for the company to be prevented from selling its controversial facial-recognition tool, Rekognition, to the US government.