Amazon’s facial recognition tech leads to mismatches in ACLU test
A test run by the American Civil Liberties Union (ACLU) has raised questions about the reliability of commercial facial-recognition technology offered by Amazon Web Services.
The software, Rekognition, was launched in 2016. It is based on deep learning – an approach to machine learning loosely inspired by the layered structure of the brain – and allows for real-time facial recognition in photos and videos. The technology was used by Sky News during May’s Royal Wedding to identify celebrity guests in real time.
Now, the ACLU has presented the results of a test it ran using Amazon’s Rekognition. The group paid $12.33 (£9.41) to compare official headshots of every member of the US Senate and House of Representatives against a database of 25,000 criminal mugshots. At the service’s default confidence threshold of 80 per cent, Rekognition incorrectly matched 28 members of Congress with mugshots of other people.
According to the ACLU, the false matches disproportionately affected ethnic minorities. While just 20 per cent of Congress identify as people of colour, 39 per cent of the false matches were Black or Latino members, including veteran civil rights leader John Lewis.
“Face surveillance is flawed, and it’s biased, and it’s dangerous,” Jacob Snow, an ACLU attorney, told Reuters.
Three Democratic lawmakers who were identified by Rekognition wrote to Amazon CEO Jeff Bezos expressing their concern about the tool’s use, questioning him about its accuracy and requesting copies of internal accuracy or bias assessments performed on Rekognition.
Amazon said that there were issues with the confidence thresholds used during the ACLU’s test (the ACLU used Rekognition’s default match settings), and warned that Rekognition was normally used to assist people in narrowing down possible suspects, rather than making final decisions.
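The dispute over thresholds can be illustrated with a short sketch. The similarity scores below are invented for illustration; the point is how the same list of candidate matches shrinks as the confidence threshold rises from Rekognition’s default of 80 per cent to the far stricter 99 per cent that Amazon has reportedly recommended for public-safety use.

```python
# Hypothetical similarity scores (0-100) for one probe photo compared
# against a mugshot database -- invented data for illustration only.
candidate_matches = [
    {"mugshot_id": "A1042", "similarity": 99.4},
    {"mugshot_id": "B2210", "similarity": 87.1},
    {"mugshot_id": "C0571", "similarity": 81.5},
    {"mugshot_id": "D9313", "similarity": 64.0},
]

def matches_above(candidates, threshold):
    """Keep only candidates whose similarity meets the threshold."""
    return [c["mugshot_id"] for c in candidates if c["similarity"] >= threshold]

# At the default 80 per cent threshold, three mugshots count as "matches"...
print(matches_above(candidate_matches, 80))  # -> ['A1042', 'B2210', 'C0571']

# ...but at a stricter 99 per cent threshold, only one remains.
print(matches_above(candidate_matches, 99))  # -> ['A1042']
```

A lower threshold casts a wider net at the cost of more false positives – which is why the choice of threshold matters so much when the results are used to identify suspects.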
“We remain excited about how image and video analysis can be a driver for good in the world,” said a spokesperson for Amazon Web Services in a statement.
The adoption of facial recognition software in public spaces has become a divisive issue, with civil liberties and privacy activists arguing that it is inaccurate, intrusive, and reflects racial biases. The ACLU has been particularly vocal in its campaigns to stop Amazon selling Rekognition to governments and other authorities, such as law enforcement in Florida and Oregon. Facial recognition is already widely used to assist police in China.
This month, a Microsoft executive called on Congress to consider possible regulations for the use of facial recognition technology given the “potential for abuse” by authorities. Meanwhile, a Google executive has also expressed concern about the shortfalls of current facial-recognition technology. Speaking to BBC News, Diane Greene, CEO of Google’s cloud business, said that facial-recognition technology had not been trained with diverse enough data and carried “inherent biases”.
While Google is working on its own commercial facial-recognition technology, it has not yet been made available.