
Facial recognition ‘fails 96 per cent of the time’, says Detroit police chief

Image credit: Andrey Mikhaylov/Dreamstime

Detroit’s police chief has admitted the facial recognition system used by his department fails to identify suspects 96 per cent of the time, with a report revealing it is used almost exclusively against black people.

At a public meeting on Monday (29 June), Detroit police chief James Craig acknowledged the flaws in the department’s facial recognition software, adding that nearly every case would go unsolved if police relied solely on the technology to identify suspects.

“If we would use the software only [to identify subjects], we would not solve the case 95-97 per cent of the time,” Craig said. “That’s if we relied totally on the software, which would be against our current policy… If we were just to use the technology by itself, to identify someone, I would say 96 per cent of the time it would misidentify.”

Craig’s remarks come after the American Civil Liberties Union (ACLU) filed a complaint with the Detroit Board of Police Commissioners seeking a public apology over the arrest of Robert Williams, a black Michigander who was arrested in January after reportedly being falsely identified by the system used by the Detroit Police Department (DPD). 

According to the complaint, DPD’s system erroneously identified Williams as a shoplifter who’d stolen five watches worth $3,800 (£3,060) from a luxury retail store a year and a half earlier. 

Whilst in custody, Williams told police the person in the CCTV footage did not look like him, The New York Times reported, with the ACLU’s complaint saying: “The investigating officer looked confused, told Mr Williams that the computer said it was him but then acknowledged that ‘the computer must have gotten it wrong’.” 

The city of Detroit uses software developed by a company named DataWorks Plus, which said that facial-recognition tech isn’t intended as the sole way of identifying people.

The system doesn’t “bring back a single candidate,” DataWorks Plus general manager Todd Pastorini told Vice, likening the software to automated fingerprint identification systems, where dozens or hundreds of potential matches are returned. “It’s hundreds. They are weighted just like a fingerprint system based on the probe [and what’s in the database].”

According to Detroit’s own police officers, the decision to question and investigate people is ultimately based on what the software returns combined with a detective’s judgment. Craig, however, noted that police officers in Detroit are not allowed to arrest someone based solely on the results of a facial recognition search, and the Detroit police claimed that they did not do so in the Williams case. 

Over the course of this year (to 22 June), the technology has been used 70 times, according to data publicly released by the DPD. In 68 of those cases, the photo fed into the software was that of a black person. In the remaining two cases, the race was listed as ‘U’, which likely means unidentified, based on other published police reports. The photos were largely pulled from social media (31 of 70 cases) or security cameras (18 of 70 cases).

Several cities across the US have banned police from using facial recognition software. Detroit, however, instead decided to regulate its use rather than ban it altogether following a debate about the technology last year. 

Late last year, the city adopted a policy which bans the use of facial recognition to “surveil the public through any camera or video device,” bans its use on livestream and recorded videos, and restricts (but does not ban) its use at protests. As part of these regulations, the DPD is also required to release weekly reports about the use of the technology, which show that it has been almost exclusively used on black people.

Williams was arrested before the policy went into practice. Craig said during the meeting that the media it ran through DataWorks’ facial recognition system was “a horrible video”. “It was grainy… it would have never made it under the new policy,” he explained. “If we can’t obtain a good picture, we’re not going to push it through to the detective.”
