Brain-machine interface capable of learning

13 May 2014
By Tereza Pultarova
Systems translating thoughts into commands could substantially improve the quality of life of people with severe disabilities

The system uses brain signal detecting electrodes to understand the users' thoughts

Focusing mental energy to convey thoughts to machines could be tiring

The first brain-machine interface system capable of learning commands has been developed in Japan.

The system, designed to help people with severe motor or speech disabilities, is the first of its kind to address the excessive mental load that existing systems place on the user. With those systems, every time the user wants to perform even a simple action, they have to focus their mental energy to deliver the message, which can be very tiring.

“We give learning capabilities to the system by implementing intelligent algorithms, which gradually learn user preferences,” said Christian Isaac Peñaloza Sanchez, a PhD candidate at the University of Osaka, Japan.

“At one point it can take control of the devices without the person having to concentrate much to achieve this goal," he said.

For the past three years, Peñaloza Sanchez has been developing the system which uses electrodes attached to the person’s scalp to measure brain activity in the form of EEG signals. The signals show patterns related to various thoughts and the general mental state of the user as well as the level of concentration.
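
The article does not describe the signal processing involved; purely as an illustration, one common way to gauge a user's level of concentration from scalp EEG is to compare power in different frequency bands. The Python sketch below assumes a generic NumPy/SciPy setup and synthetic data rather than the Osaka system's actual pipeline.

    # Illustrative sketch only: estimating a rough "concentration" index from raw EEG
    # by comparing beta-band and alpha-band power. The band limits and the ratio used
    # here are common conventions, not details taken from the system described above.
    import numpy as np
    from scipy.signal import welch

    def band_power(samples, fs, low, high):
        """Average spectral power of one EEG channel within a frequency band."""
        freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
        mask = (freqs >= low) & (freqs < high)
        return np.trapz(psd[mask], freqs[mask])

    def concentration_index(samples, fs=256):
        """Beta/alpha power ratio, often used as a crude attention measure."""
        alpha = band_power(samples, fs, 8, 13)   # relaxed, idle rhythm
        beta = band_power(samples, fs, 13, 30)   # active concentration
        return beta / (alpha + 1e-12)

    # Example with synthetic data standing in for one electrode's signal
    eeg = np.random.randn(10 * 256)              # 10 seconds sampled at 256 Hz
    print(concentration_index(eeg))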

Currently, the system can learn up to 90 per cent of common instructions such as controlling a wheelchair and navigating it around a room.

After the system has learned a command from the user, the action can be triggered either by pressing a button or by a quick thought. While performing the automated action, the system looks for the so-called error-related negativity signal – a reaction in the human brain when an incorrect response is initiated – for example, if the system opens a window instead of turning on the TV.
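
How the error signal is detected is not explained; the sketch below is only a schematic of the monitoring loop, with a stub standing in for whatever error-related negativity detector the researchers actually use.

    # Schematic only: trigger a learned action, watch the EEG that follows, and undo
    # the action if an error-related negativity is detected. 'looks_like_errn' is a
    # hypothetical stub, not the real classifier.
    import random

    def looks_like_errn(eeg_window):
        """Stub detector: a real system would classify the post-action EEG window."""
        return random.random() < 0.1             # pretend roughly 10% of actions look wrong

    def run_automated_action(start, undo, eeg_window):
        """Start an automated action and cancel it if the user's brain objects."""
        start()
        if looks_like_errn(eeg_window):          # brain signalled "that was not what I wanted"
            undo()                               # e.g. close the window opened by mistake
            return False
        return True

    # Example: the system tries to turn on the TV as an automated action.
    kept = run_automated_action(start=lambda: print("TV on"),
                                undo=lambda: print("TV off again"),
                                eeg_window=[0.0] * 128)
    print("action kept" if kept else "action undone")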

"We've had pretty good results in various experiments with multiple people who have participated as volunteers in our in vivo trials,” said Peñaloza Sanchez.

“We found that user mental fatigue decreases significantly and the level of learning by the system increases substantially."

The system could be used to control a whole range of devices – robotic prostheses, computer pointers or household appliances. It also includes a graphical interface that displays the available devices or objects and interprets EEG signals to assign user commands and control the devices.
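
The article gives no detail of how decoded commands reach the devices; the fragment below simply illustrates the idea of a registry that routes a decoded intention to one of several controllable devices. The device names and command format are invented for the example.

    # Hypothetical device registry: a decoded EEG command such as ('tv', 'power_on')
    # is routed to the matching device handler. Names here are illustrative only.
    DEVICES = {
        "wheelchair": lambda cmd: print(f"wheelchair -> {cmd}"),
        "tv":         lambda cmd: print(f"tv -> {cmd}"),
        "window":     lambda cmd: print(f"window -> {cmd}"),
    }

    def dispatch(decoded_command):
        """decoded_command is a (device, action) pair produced by the EEG decoder."""
        device, action = decoded_command
        DEVICES[device](action)

    dispatch(("tv", "power_on"))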

The system can also learn to predict the user’s responses to certain situations and initiate the action before the user even asks for it. 

"It collects data from wireless sensors, electrodes and user commands to learn a correlation between the environment of the room, the mental state of the person and its common activities” said Christian Peñaloza.
