App uses voice-based AI to track wellbeing of mental health patients
Image credit: Axel Bueckert - Dreamstime
Researchers in the US have found that an interactive voice application, which uses artificial intelligence, is an effective way to monitor the wellbeing of patients being treated for serious mental illness.
A University of California, Los Angeles (UCLA) study followed 47 patients for up to 14 months using an application called MyCoachConnect. All of the patients taking part in the study were being treated by physicians for serious mental illnesses, including bipolar disorder, schizophrenia and major depressive disorder.
For the study, participants called a toll-free number once or twice a week and answered three open-ended questions when prompted by a computer-generated voice. These included: "How have you been over the past few days?", "What’s been troubling or challenging you over the past few days?", and "What’s been particularly good or positive?".
The MyCoachConnect app was designed to collect personalised patient responses, explained Dr Armen Arevian, director of the Innovation Lab at UCLA's Jane and Terry Semel Institute for Neuroscience and Human Behavior. More specifically, the AI was trained to use an individual’s own words to provide them with a personalised analysis.
The application focused primarily on the words the patients chose in their responses and how those responses changed over time, with less emphasis placed on audible factors such as tone of voice.
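The idea of tracking shifts in an individual's word choice between calls can be illustrated with a minimal sketch. This is not the MyCoachConnect implementation; the transcripts and the use of bag-of-words cosine similarity here are illustrative assumptions only.

```python
# Illustrative sketch (NOT the MyCoachConnect method): flag shifts in a
# patient's language by comparing word-count vectors of successive calls.
from collections import Counter
import math

def word_vector(transcript: str) -> Counter:
    """Lowercase bag-of-words counts for one call transcript."""
    return Counter(transcript.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical weekly responses from one patient.
calls = [
    "sleeping well and feeling calm this week",
    "sleeping well mostly calm a little tired",
    "not sleeping anxious and worried about work",
]

# A drop in similarity between consecutive calls marks a shift in language
# that could prompt a closer look by a clinician.
for earlier, later in zip(calls, calls[1:]):
    sim = cosine_similarity(word_vector(earlier), word_vector(later))
    print(f"similarity: {sim:.2f}")
```

In this toy example the similarity drops sharply at the third call, where the patient's vocabulary changes from "calm" and "well" to "anxious" and "worried"; a real system would of course use far richer language models and individually calibrated baselines.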
The analysis of the data, conducted in collaboration with researchers from the Signal Analysis & Interpretation Laboratory (SAIL) at the University of Southern California (USC), found that the application monitored patients’ mental states as accurately as their treating physicians did.
“The way people answer questions and the way they change their answers over time is unique to each patient,” Arevian said. “We were looking at a person as a person and not as a diagnosis.”
For the study, the patients made calls from a mobile phone, landline, or payphone and were asked to speak for two to three minutes for each question.
“Technology doesn’t have to be complicated,” Arevian added. “In this study, patients didn’t need a smartphone or even a phone at all. It could be simple and low tech on the patient end and high tech on the backend.”
The team hope that AI capable of analysing data collected from apps such as MyCoachConnect will enable more proactive and personalised care. For example, the app could help improve treatment by allowing clinicians to intervene early when a patient’s symptoms worsen.
“Artificial intelligence allowed us to illuminate the various clinically meaningful dimensions of language use and vocal patterns of the patients over time and personalised at each individual level,” said Dr Shri Narayanan, the director of SAIL at the USC Viterbi School of Engineering.
Arevian added that some of the participants, interviewed after the study ended, said they found the system easy and enjoyable to use.
“They said speaking to a computer-generated voice allowed them to speak more freely,” he said. “It also helped them feel less lonely because they knew that someone would be listening to it, and to them, that meant that someone cared.”
MyCoachConnect was built and hosted on Chorus, a platform developed by Arevian at UCLA that allows people to visually create mobile and other computer applications, without programming, in as little as a few minutes.
Last November, a team of US researchers suggested that machine learning systems could assist doctors in monitoring the mental health of patients through speech-based tests.
Also, back in July 2019, computer scientists at the University of Alberta developed AI algorithms that can detect and identify depression through vocal cues.