
Amazon’s facial-recognition tools ‘primed for abuse’ by government, groups argue

CCTV camera and footage. Image credit: Dreamstime

A coalition of civil liberties groups has criticised Amazon for providing a facial-recognition service to US law enforcement agencies, arguing that it could pose a “grave threat” to communities.

The American Civil Liberties Union (ACLU) warns that Amazon’s work with law enforcement on facial recognition could prove a “grave threat” to customers and communities alike.

Amazon Web Services’ facial-recognition system, known as Rekognition, was launched in 2016 and allows for real-time facial recognition in photos and videos. The system is based on deep learning: an approach to machine learning built on artificial neural networks, whose layered structure is loosely inspired by the human brain.
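
The service is exposed through the standard AWS SDKs. The minimal Python sketch below calls Rekognition’s face-detection operation on a single image; the bucket name, object key and region are placeholders, and working AWS credentials are assumed.

```python
# Minimal sketch: detect faces in one image with Amazon Rekognition via boto3.
# Bucket, key and region are placeholders; AWS credentials must be configured.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "frames/frame-001.jpg"}},
    Attributes=["ALL"],  # request the full set of facial attributes
)

# Report where each detected face sits in the frame and how confident the model is.
for face in response["FaceDetails"]:
    box = face["BoundingBox"]
    print(f"Face at left={box['Left']:.2f}, top={box['Top']:.2f} "
          f"(confidence {face['Confidence']:.1f}%)")
```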

Last weekend, broadcaster Sky used Rekognition to automatically identify celebrity guests as they arrived at the high-profile wedding of Prince Harry and Meghan Markle.
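
Rekognition offers a dedicated celebrity-recognition operation for this kind of task. The brief sketch below shows that call in isolation – it is not a reconstruction of Sky’s actual system – with placeholder bucket and key names.

```python
# Minimal sketch: identify well-known faces in a single frame with Rekognition's
# celebrity-recognition operation. Bucket and key names are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="eu-west-1")

response = rekognition.recognize_celebrities(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "arrivals/frame-042.jpg"}}
)

for celeb in response["CelebrityFaces"]:
    print(f"{celeb['Name']} (match confidence {celeb['MatchConfidence']:.1f}%)")
```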

According to reports, the Washington County Sheriff’s Office in Oregon first adopted Rekognition to search photographs for suspects previously captured in mug shots, and went on to use the tool approximately 20 times a day to identify suspects in surveillance videos. An email obtained by the ACLU states that loading a database of hundreds of thousands of photographs cost $400 (£300) and that continuing to run the service costs the office as little as $6 (£4.50) per month.
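
That workflow maps onto Rekognition’s face-collection operations: a collection is created once, mug shots are indexed into it, and probe images are then searched against it. The sketch below illustrates those three steps, assuming the images sit in S3; the collection, bucket and identifier names are placeholders rather than details of the Washington County deployment.

```python
# Illustrative sketch of the index-then-search workflow with Rekognition face
# collections. All names below are placeholders; images are assumed to be in S3.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

COLLECTION = "mugshot-collection"

# One-off setup: create a collection and index an existing mug shot into it.
rekognition.create_collection(CollectionId=COLLECTION)
rekognition.index_faces(
    CollectionId=COLLECTION,
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "mugshots/12345.jpg"}},
    ExternalImageId="booking-12345",  # identifier returned with any later match
)

# Later: search the collection with a probe image, e.g. a surveillance still.
matches = rekognition.search_faces_by_image(
    CollectionId=COLLECTION,
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "probes/still-001.jpg"}},
    FaceMatchThreshold=80,  # minimum similarity score (per cent) to return
    MaxFaces=5,
)

for match in matches["FaceMatches"]:
    face = match["Face"]
    print(f"{face['ExternalImageId']}: similarity {match['Similarity']:.1f}%")
```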

In 2017, the police department of Orlando, Florida, announced that it would trial Rekognition for the real-time detection of persons of interest. Amazon said that it would receive feeds from public-safety cameras around Orlando and use Rekognition to identify suspects, notifying the police department when matches emerge.
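
Real-time matching of this kind is supported through Rekognition Video’s stream processors, which read frames from a Kinesis Video Stream and publish face-search results to a Kinesis Data Stream. The sketch below shows the shape of that setup; every ARN, name and threshold is a placeholder, not a description of the Orlando deployment.

```python
# Illustrative sketch of a Rekognition Video stream processor that searches faces
# from a live feed against an existing collection. ARNs and names are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

rekognition.create_stream_processor(
    Name="public-safety-feed-01",
    Input={"KinesisVideoStream": {
        "Arn": "arn:aws:kinesisvideo:us-east-1:123456789012:stream/camera-feed/1"}},
    Output={"KinesisDataStream": {
        "Arn": "arn:aws:kinesis:us-east-1:123456789012:stream/face-matches"}},
    Settings={"FaceSearch": {
        "CollectionId": "persons-of-interest",  # collection of indexed faces
        "FaceMatchThreshold": 85.0,
    }},
    RoleArn="arn:aws:iam::123456789012:role/RekognitionStreamRole",
)

# Start processing; match records then appear on the output Kinesis Data Stream.
rekognition.start_stream_processor(Name="public-safety-feed-01")
```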

More recently, policies have reportedly been adopted stating that officers may use facial-recognition tools in real time to identify suspects who are not carrying ID, or in critical situations.

The ACLU – along with other civil liberties and privacy advocacy groups – is concerned that the use of facial-recognition tools in policing could come at the expense of anonymity and privacy. For instance, combining the body cameras worn by law-enforcement officers in the field with facial-recognition tools could allow people to be tracked in real time as they carry out mundane daily activities. The groups suggest that this technology could be built into a system “to automate the identification and tracking of anyone”.

Concerns also focus on the potential for these tools to amplify existing racial bias in law enforcement, to the detriment of minority communities. A study published in February found that commercial facial-recognition systems are far less accurate at identifying even basic traits of darker-skinned faces.

“In the past, Amazon has opposed secret government surveillance. And you [Jeff Bezos, Amazon CEO] have personally supported First Amendment freedoms and spoken out against the discriminatory Muslim Ban. But Amazon’s Rekognition product runs counter to these values,” the ACLU wrote in a letter with 33 other advocacy groups. “As advertised, Rekognition is a powerful surveillance system readily available to violate rights and target communities of colour [communities of racial minorities].”

“Amazon Rekognition is primed for abuse in the hands of governments. This product poses a grave threat to communities, including people of colour and immigrants, and to the trust and respect Amazon has worked to build.”

“Amazon must act swiftly to stand up for civil rights and civil liberties, including those of its own customers, and take Rekognition off the table for governments,” the letter concludes.

In response, Deputy Jeff Talbot, a spokesperson for the Washington County Sheriff’s Office, said: “We are not mass-collecting. We are not putting a camera out on a street corner. We want our local community to be aware of what we’re doing, how we’re using it to solve crimes – what it is and, just as importantly, what it is not.”

Earlier this month, privacy groups in the UK criticised facial-recognition tools used by UK police as “almost entirely inaccurate”, owing to the number of false positives flagged up by the systems.
