AI voice assistants could negatively impact child development, research finds
Voice control assistant devices from the likes of Google and Amazon could have negative “long-term consequences on empathy, compassion and critical thinking” for children, researchers have said.
A team from the University of Cambridge said that children can come to view social robots as beings with their own rights and feelings, despite knowing they are machines.
This is exacerbated by the way they interact with the devices through the use of wake commands such as “Hey Google” or “Hi Alexa”, which increases the risk that children over-anthropomorphise digital devices.
However, there is no expectation that polite terms such as “please” or “thank you” should be used during these interactions, nor any need to consider tone of voice or whether a command might come across as rude or obnoxious.
The inability of smart speakers to offer children constructive feedback if their speech is considered rude or inappropriate could normalise negative social interactions during child development, the researchers suggested.
They pointed out that Amazon has made some initial moves to counter this through the use of its optional ‘Magic Word’ function on Alexa devices whereby it acknowledges the use of polite mannerisms – for example, by responding to the child with “Thanks for asking so nicely”.
The research suggested that the impact may have been particularly negative in recent years, with children’s social development already impaired by Covid-19 stay-at-home restrictions.
“The rise of voice devices has provided great benefit to the population,” the paper states.
“Their abilities to provide information rapidly, assist with daily activities and act as a social companion to lonely adults are both important and useful. However, urgent research is required into the long-term consequences for children interacting with such devices.
“Interacting with the devices at a crucial stage in social and emotional development might have long-term consequences on empathy, compassion and critical thinking.”
In 2019, another team developed a gender-neutral voice for AI assistants in an attempt to avoid reinforcing sexist stereotypes in the typically female-fronted services.