Homeless and ethnic minorities targeted for Google facial data collection

Image: Man sleeping rough outside some gates (credit: Dreamstime)

According to a New York Daily News report, an agency tasked with gathering facial data to train Google's facial-recognition software used dubious methods to collect it.

In July, it was reported that Google was planning to replace the fingerprint scanner in its upcoming Pixel 4 range of smartphones with a facial-recognition unlocking feature, similar to those already found on Apple and Huawei phones. Google offered $5 gift tokens to people recruited in US cities in exchange for facial scans to refine the feature.

Sources who were involved in the project told the New York Daily News that Google contractor Randstad had sent teams of researchers to cities (including in Georgia and California), explicitly telling them to target homeless people and people with dark skin. They were told not to mention that they were working for Google or that they were recording facial data.

One source claimed that they had been told to target homeless people because they were least likely to work out what was happening and speak to the media, while another said: “I feel like they wanted us to prey on the weak.” Homeless people were allegedly reminded that they could, under a California state law, exchange their $5 gift token for cash.

The recommended technique reportedly involved approaching people and asking them to take part in a “selfie game” or “survey”, or to “just play with the phone for a couple of minutes” to try out a new app, while the hardware scanned their faces; researchers were also said to have pressured people into approving a consent form without reading it.

A source alleged that a Google manager had directed the researchers to target people with dark skin. Commercial facial-recognition systems have been shown to perform poorly for groups underrepresented in their training datasets, including women and people with dark skin; this comparatively poor accuracy when identifying ethnic minorities is likely to have motivated the targeting.

Jake Snow, an ACLU tech and civil liberties attorney, wrote on Twitter: “This is totally unacceptable conduct from a Google contractor. It’s why the way AI is built today needs to change. The answer to algorithmic bias is not to target the most vulnerable.”

A Google spokesperson stated: “We’re taking these claims seriously and investigating them. The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided.”

The spokesperson added that the collection of facial data must involve a diverse range of people in order to build an “inclusive” feature.
