
Brain-machine interface with VR turns intentions into actions
Image credit: Pop Nukoonrat/Dreamstime
Researchers have developed a wearable brain-machine interface (BMI) system that they believe could improve the quality of life for people with motor dysfunction or paralysis.
An international research team – led by the lab of Woon-Hong Yeo at the Georgia Institute of Technology (Georgia Tech) – combined wireless soft-scalp electronics and virtual reality (VR) in a BMI system that allows the user to imagine an action and wirelessly control a wheelchair or robotic arm.
The technology could even help those struggling with locked-in syndrome – when a person is fully conscious but unable to move or communicate.
“The major advantage of this system to the user, compared to what currently exists, is that it is comfortable to wear, and doesn’t have any wires,” said Yeo, associate professor in Georgia Tech’s School of Mechanical Engineering.
BMI systems are rehabilitation technologies that analyse a person’s brain signals and translate that neural activity into commands, turning intentions into actions. The most common non-invasive method for acquiring those signals is electroencephalography (EEG), which typically requires a cumbersome electrode skull cap and a tangled web of wires.
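In outline, a non-invasive BMI repeatedly samples a short window of EEG, decodes the user’s intention from it, and maps that intention to a device command. The sketch below is a minimal illustration of that loop, not the Georgia Tech implementation; the read_eeg_window, classify_intention and send_wheelchair_command helpers, and the command names, are hypothetical placeholders.

```python
import time

# Hypothetical mapping from a decoded motor-imagery class to a
# wheelchair action; a real system would define its own protocol.
COMMANDS = {
    "left_hand": "turn_left",
    "right_hand": "turn_right",
    "both_hands": "forward",
    "rest": "stop",
}

def bmi_control_loop(read_eeg_window, classify_intention,
                     send_wheelchair_command, window_s=1.0):
    """Continuously turn decoded intentions into device commands."""
    while True:
        window = read_eeg_window(window_s)       # (channels, samples) of raw EEG
        intention = classify_intention(window)   # e.g. "left_hand"
        send_wheelchair_command(COMMANDS.get(intention, "stop"))
        time.sleep(window_s)
```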
Such devices, however, rely heavily on gels and pastes to maintain skin contact, require extensive set-up times, and are inconvenient and uncomfortable to use. They also often suffer from poor signal acquisition because of material degradation or motion artefacts – the ancillary 'noise' caused by, for example, teeth grinding or eye blinking – which shows up in the brain data and must be filtered out.
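For illustration only, one common way to suppress out-of-band artefact noise is a zero-phase band-pass filter applied to each EEG channel, as sketched below. The 250 Hz sampling rate and 1–40 Hz pass band are assumptions for the example, not parameters taken from the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(eeg, fs=250.0, low=1.0, high=40.0, order=4):
    """Zero-phase Butterworth band-pass filter to suppress slow drift and
    high-frequency muscle/motion artefacts; eeg is a (channels, samples) array."""
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)

# Example: filter one second of simulated 8-channel EEG sampled at 250 Hz.
noisy = np.random.randn(8, 250)
clean = bandpass_eeg(noisy)
```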
To tackle this, Yeo designed a portable EEG system that integrates imperceptible microneedle electrodes with soft wireless circuits, offering improved signal acquisition. The team also integrated a machine-learning algorithm and a VR component to interpret those brain signals and determine what actions a user wants to perform.
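The study’s own decoding model is not reproduced here; as a hedged illustration, a common baseline for motor-imagery EEG is to extract log-variance (band-power) features per channel and feed them to a linear classifier, as in the scikit-learn sketch below. The channel counts, trial counts and random data are placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def log_bandpower_features(epochs):
    """epochs: (n_trials, n_channels, n_samples) of band-passed EEG.
    Log-variance per channel is a crude proxy for band power."""
    return np.log(np.var(epochs, axis=-1) + 1e-12)

# Toy example: random data standing in for labelled motor-imagery trials
# (0 = left-hand imagery, 1 = right-hand imagery).
rng = np.random.default_rng(0)
X_train = log_bandpower_features(rng.standard_normal((40, 8, 250)))
y_train = rng.integers(0, 2, size=40)

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
new_trial = log_bandpower_features(rng.standard_normal((1, 8, 250)))
predicted_intention = clf.predict(new_trial)[0]
```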
According to the researchers, the new system has been tested with four human subjects but has yet to be studied with disabled participants. “This is just a first demonstration, but we’re thrilled with what we have seen,” noted Yeo.
Yeo’s team originally introduced a soft, wearable EEG brain-machine interface in a 2019 study. “This new brain-machine interface uses an entirely different paradigm, involving imagined motor actions, such as grasping with either hand, which frees the subject from having to look at too many stimuli,” said Musa Mahmood, a PhD student in Yeo’s lab.
In their latest study, participants showed accurate control of virtual-reality exercises using their thoughts – their motor imagery. The visual cues enhance the process for both the user and the researchers gathering information.
“The virtual prompts have proven to be very helpful,” Yeo said. “They speed up and improve user engagement and accuracy. And we could record continuous, high-quality motor-imagery activity.”
According to Mahmood, future work on the system will focus on optimising electrode placement and more advanced integration of stimulus-based EEG, using what they’ve learned from the last two studies.
The team, which included researchers from the University of Kent, UK, and Yonsei University, South Korea, describes the new motor-imagery-based BMI system in this month's Advanced Science.