Amazon slaps one-year moratorium on police use of Rekognition
Image credit: Reuters
Amazon will implement a year-long moratorium on police use of its facial recognition technology, saying it hopes the pause will give lawmakers time to introduce regulations governing ethical use of the technology.
Amazon’s cloud-based Rekognition platform was launched in 2016 and includes features such as unsafe object detection, human tracking and detection of facial attributes like emotion. It has been adopted by many US police forces (including in Florida and Oregon) to identify persons of interest, as well as by private organisations.
San Francisco became the first US city to ban police use of facial recognition technology last year.
Announcing the one-year ban on police use of Rekognition, Amazon said that Congress appears ready to “put in place stronger regulations to govern the ethical use of facial recognition technology” and that it hopes the moratorium will allow enough time for these regulations to be implemented.
The statement added that Amazon will continue to allow other organisations to use its facial recognition technology, including organisations which use the technology for child protection purposes. The short statement may leave open the possibility for Amazon to offer the technology to other government agencies, such as Immigration and Customs Enforcement.
Amazon did not explicitly acknowledge the ongoing protests following the killing of George Floyd by Minneapolis police officers, which have renewed outrage against racism and misconduct in law enforcement. The American Civil Liberties Union has called for surveillance technology such as facial recognition and drones to be kept away from protests and the wider community.
In addition to fears that police use of facial recognition infringes on civil liberties, there are concerns about bias (including racial bias) in commercial facial recognition software.
Notably, a 2018 MIT paper written by Joy Buolamwini and her colleagues found that commercial facial recognition software performs much more poorly when assessing female faces and dark-skinned faces, with error rates ranging from 20 to 30 per cent for darker-skinned female faces. The researchers hypothesised that this is principally due to training data lacking faces from these groups of people.
Buolamwini and her colleagues have been backed by other academics, with dozens defending them in an open letter after two senior Amazon figures wrote a series of blog posts attempting to discredit their work identifying bias in Rekognition. Amazon has consistently defended the accuracy of its facial recognition technology and last year rejected a vote to block its sale to governments.
Following Amazon's announcement this week, Buolamwini tweeted: “This is a collective effort by not only researchers but also civil liberties organisations, activists, employees and shareholders applying pressure coupled with the tragic death of George Floyd and tardy corporate acknowledgement that Black Lives Matter.”
Evan Greer, deputy director of digital rights group Fight for the Future, characterised Amazon’s announcement as “nothing more than a public relations stunt”, but also as an indication that facial recognition technology is becoming “politically toxic”.
“Amazon knows that facial recognition software is dangerous. They know it’s the perfect tool for tyranny. They know it’s racist and that in the hands of police it will simply exacerbate systemic discrimination in our criminal justice system,” Greer said. She opined that Amazon will push for industry-friendly regulations which assuage public concern without damaging its bottom line.
Amazon’s announcement comes just one day after IBM announced that it would terminate its “general purpose” facial recognition business. IBM’s statement said that vendors of AI tools such as facial recognition have a responsibility to ensure such tools are tested for bias.
It is worth noting that use of facial recognition technology in public spaces is likely to be of limited use in the immediate future due to widespread mask-wearing as a public health measure.