AI systems learn to be sexist and racist from the internet

A study published in Science has found that machine learning programmes can acquire culturally embedded prejudices, ranging from a fondness for flowers to discrimination against women.

Most people take for granted that computers are entirely cool, logical and objective. However, when machines gain the ability to learn based on human input, they may end up reflecting our cultural preferences and prejudices. A group of researchers based at Princeton University set about testing the extent to which computers pick up irrational preferences.

The researchers based their study on the Implicit Association Test (IAT). The IAT, developed in the 1990s, is widely used in social psychology studies to reveal cultural or personal bias. By asking a person to pair concepts they may find similar or dissimilar and timing their responses, the IAT gauges the strength of association between the concepts. For instance, we would expect to be able to pair “suffering” with “bad” more quickly than with “good”.

To extend the IAT to non-human subjects, the Princeton University team applied a word-embedding version of the test to GloVe, a machine learning programme that represents words as numerical vectors. The programme measures the co-occurrence of words in small windows of text, and words which often appear close together are judged to have a stronger association.

This programme was used to process a huge sample of human culture: 840 billion words of web content.
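
This kind of association can be probed directly with publicly released GloVe vectors. The short Python sketch below is purely illustrative: it uses the gensim library and a small Wikipedia-trained vector set, not the far larger web corpus analysed in the study.

# Illustrative sketch: association strength as cosine similarity between
# GloVe word vectors, loaded through gensim's downloader (a small
# Wikipedia-trained set, not the study's 840-billion-word corpus).
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")

# Words that frequently appear in similar contexts end up with similar
# vectors, so a higher cosine similarity indicates a stronger association.
# The paper's flower/pleasantness finding suggests the first score
# should tend to be the higher of the two.
print(vectors.similarity("flower", "pleasant"))
print(vectors.similarity("flower", "unpleasant"))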

Using this sample, Dr Narayanan and his colleagues examined sets of target words, such as “programmer, engineer, scientist”, and “nurse, teacher, librarian”, alongside sets of attributes such as “man, male” or “woman, female”.

The researchers uncovered some innocent preferences, such as an association of flowers and instruments with pleasantness, but also found evidence of gender and racial discrimination. For instance, GloVe came to associate female names with terms such as “wedding” and “parents”, while associating male names with terms such as “salary” and “professional”. These results replicate the biases found in human IAT results and reflect real racial and gender disparities.
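
As a rough illustration of how such a comparison can be run, the sketch below computes a simple difference-of-means association score over the same kind of vectors. The word lists and the scoring are stand-ins in the spirit of the paper's word-embedding association test, not its exact stimuli or statistic.

# Illustrative IAT-style comparison over word embeddings: for each target
# word, how much more similar is it to "career" attributes than to
# "family" attributes? Word lists here are examples, not the study's own.
import numpy as np
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")

career = ["salary", "professional", "office", "business"]
family = ["wedding", "parents", "home", "children"]
female_terms = ["she", "her", "woman", "daughter"]
male_terms = ["he", "his", "man", "son"]

def association(word, attrs_a, attrs_b):
    # Mean similarity to attribute set A minus mean similarity to set B.
    return (np.mean([vectors.similarity(word, a) for a in attrs_a])
            - np.mean([vectors.similarity(word, b) for b in attrs_b]))

# Positive scores lean towards "career", negative towards "family".
for word in female_terms + male_terms:
    print(f"{word:10s} {association(word, career, family):+.3f}")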

“We have a situation where these artificial intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from,” said Dr Arvind Narayanan, an assistant professor of computer science at Princeton University, who led the study.

Computers learning prejudice from humans could cause trouble by reinforcing these biases. This is likely to become more problematic as we rely more on natural language processing, for instance, in automatic translation or online text searches.

For instance, when foreign languages are processed by machine learning programmes, results can be influenced by gender stereotypes. The Turkish language uses a gender-neutral third-person pronoun, “o”, which Google Translate renders as a gendered English pronoun depending on the rest of the sentence: “o bir doktor” is translated into “he is a doctor”, while “o bir hemşire” is translated into “she is a nurse”.

“The biases that we studied in the paper are easy to overlook when designers are creating systems,” said Dr Narayanan. “The biases and stereotypes in our society reflected in our language are complex and longstanding. Rather than trying to sanitise or eliminate them, we should treat biases as part of the language and establish an explicit way in machine learning of determining what we consider acceptable and unacceptable.”

While some researchers are working to prevent computers from picking up irrational human habits and biases, others argue that there are advantages to less-than-logical cognition. A recent study published in Human Factors has suggested that automated systems could benefit from developing a form of “gut feeling” – intuitive cognition – to be used alongside logical deliberation. The paper argues that intuitive cognition allows for faster responses and could be useful in smart cars, homes and devices.
