The algorithms behind Google Instant may be perpetuating harmful stereotypes of minorities

Google Instant may perpetuate stereotypes

The auto-complete function of Google Instant has been found to perpetuate racist, sexist and homophobic stereotypes.

Google Instant uses complex algorithms to guess what a user is searching for from the moment they begin to type into the website’s search field, helping them find the information they are looking for more quickly.
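Google does not publish the details of those algorithms, but the basic behaviour can be sketched as a frequency-ranked prefix lookup over past queries. The query log, counts and suggest() function below are invented purely for illustration – a minimal Python sketch, not a description of Google’s actual system.

    from collections import Counter

    # Hypothetical log of full queries previously typed by users, with counts.
    # The real system is far more complex (personalisation, freshness, filtering),
    # but a frequency-ranked prefix match captures the basic behaviour described above.
    query_log = Counter({
        "why do birds sing": 120,
        "why do we dream": 95,
        "why do cats purr": 80,
    })

    def suggest(prefix: str, k: int = 5) -> list[str]:
        """Return the k most frequent past queries that start with the typed prefix."""
        matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
        matches.sort(key=lambda item: item[1], reverse=True)
        return [q for q, _ in matches[:k]]

    print(suggest("why do"))  # ['why do birds sing', 'why do we dream', 'why do cats purr']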

But a pair of researchers from Lancaster University, Baker and Potts, have found that the auto-complete function has unintended consequences when searches are based on the terms ‘gay’ and ‘black’, as they describe in a paper in the current issue of the journal Critical Discourse Studies.

When they typed ‘why do gay’, for instance, Google Instant auto-completed the query with ‘men have high voices’, ‘men get aids’, ‘men lisp’, ‘people exist’ and ‘talk funny’.

“Clearly these suggested questions appear because they are the sorts of questions that other people have typed into Google in the past with a relatively high frequency,” Baker and Potts write.

“It is also likely that once certain questions become particularly frequent, they will be clicked on more often (thus enhancing their popularity further) so they will continue to appear as auto-suggestions.”

“It seems as though humans may have already shaped the internet in their image, having taught stereotypes to search engines and even trained them to hastily present these as results of ‘top relevance’.”
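The feedback loop the researchers describe – frequent questions are shown, clicked, and thereby made more frequent still – can be illustrated by extending the Python sketch above. Treating a click as another occurrence of the query is an assumption made for illustration, not a claim about how Google actually counts clicks.

    def click(suggestion: str) -> None:
        """Selecting a suggestion counts as typing that query again,
        so it ranks even higher the next time the same prefix is entered."""
        query_log[suggestion] += 1

    # Repeatedly clicking whatever is currently on top entrenches it further.
    for _ in range(10):
        top = suggest("why do", k=1)[0]
        click(top)

    print(suggest("why do"))  # the top query's lead has now grown even larger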

The results of further auto-completed searches were then divided into categories to provide a detailed breakdown of how particular social groups – like Muslims, Jews, Christians, Asians and lesbians – appear to be associated with certain qualities.

“Interestingly, the control category ‘people’ produced proportionally the most negative questions, which tended to be concerned with why people engaged in hurtful behaviours,” they write.

“However, there were also relatively high proportions of negative evaluative questions for black people, gays and males.”
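A breakdown of this kind amounts to a simple tally over hand-coded suggestions. The groups and negative/non-negative labels below are invented placeholders, not data from the paper; the sketch only shows how such proportions would be computed.

    from collections import defaultdict

    # Invented (group, coded_as_negative) labels standing in for the researchers' hand-coding.
    coded = [
        ("people", True), ("people", True), ("people", False),
        ("black people", True), ("black people", False),
        ("gay people", True), ("gay people", False), ("gay people", False),
    ]

    totals, negatives = defaultdict(int), defaultdict(int)
    for group, is_negative in coded:
        totals[group] += 1
        negatives[group] += is_negative

    for group in totals:
        print(f"{group}: {negatives[group] / totals[group]:.0%} of questions coded negative")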

Baker and Potts point out that groups who “either constitute a minority or have been subject to oppression either now or in the past” seem to be most stereotyped.

The researchers say Google should introduce a way for users to object to offensive auto-completions, something that does not currently exist, and that “Google should seriously consider removing any statements from auto-complete that are consistently flagged”.
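One way such a flagging mechanism could work is sketched below against the hypothetical suggest() function above; the flag threshold and the filtering behaviour are assumptions, not detail from the paper beyond the idea of removing ‘consistently flagged’ statements.

    FLAG_THRESHOLD = 50  # assumed cut-off for treating a completion as "consistently flagged"
    flag_counts = Counter()

    def flag(suggestion: str) -> None:
        """Record one user objection to an offensive auto-completion."""
        flag_counts[suggestion] += 1

    def suggest_filtered(prefix: str, k: int = 5) -> list[str]:
        """Like suggest(), but drop any completion flagged past the threshold."""
        candidates = suggest(prefix, k=len(query_log))
        return [q for q in candidates if flag_counts[q] < FLAG_THRESHOLD][:k]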

While the authors do not suggest that seeing auto-completed questions like ‘why do black people … like fried chicken’ will cause people to “internalise stereotypes”, they are concerned that some users may not realise they are being presented with stereotypes and may “reproduce them in other contexts”.

They also raise the issue that other users, who hold such stereotypes, “may feel that their attitudes are validated, because the questions appear on a reputable search page such as Google”.
