Advance in brain and eye tracking to prevent crashes
An eye-tracking, brain-monitoring experiment in progress. The infrared camera is on the small black console on the desk in front of the main PC screen
Advances in capturing data on brain activity and eye movement could prevent car crashes and herald the end of the computer mouse.
The breakthrough by researchers at the University of Leicester, in collaboration with the University of Buenos Aires in Argentina, uses novel signal processing techniques to overcome the technical challenges that previously prevented simultaneous monitoring of eye movement and brain activity.
The team has merged high-speed eye tracking, which records eye movements in unprecedented detail using cutting-edge infrared cameras, with high-density electroencephalography (EEG), which measures electrical brain activity with millisecond precision through electrodes placed on the scalp.
Dr Matias Ison, who led the research, says: “Historically, eye-tracking and EEG have evolved as independent fields. We have managed to overcome the challenges that were standing in the way of integrating these technologies.
“This is already leading to a much better understanding of how the brain responds when the eyes are moving.”
The breakthrough could be the first step towards a system that combines brain and eye monitoring to automatically alert drivers who are showing signs of drowsiness. This would be a major development: the Department for Transport estimates that fatigue accounts for around 20 per cent of traffic accidents on the UK’s motorways.
The system would be built into the vehicle and connected unobtrusively to the driver, with the EEG looking out for brain signals that only occur in the early stages of sleepiness, while the eye tracker would reinforce this by looking for erratic gaze patterns symptomatic of someone starting to feel drowsy.
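The two-channel logic described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' implementation: the specific EEG marker (a theta/alpha power ratio), the gaze-dispersion measure, and both thresholds are assumptions chosen to show how requiring agreement between the two signals could keep false alarms down.

```python
# Hypothetical sketch: fuse an EEG drowsiness marker with gaze erraticness.
# The feature names and thresholds below are illustrative, not from the study.

def drowsiness_alert(eeg_theta_ratio: float, gaze_dispersion: float) -> bool:
    """Return True only when both channels point to early drowsiness.

    eeg_theta_ratio  -- assumed marker: ratio of theta- to alpha-band EEG
                        power, which tends to rise in early sleepiness.
    gaze_dispersion  -- assumed marker: variance of recent gaze positions
                        (deg^2); erratic scanning inflates it.
    """
    EEG_THRESHOLD = 1.2    # illustrative cut-off
    GAZE_THRESHOLD = 4.0   # illustrative cut-off
    return eeg_theta_ratio > EEG_THRESHOLD and gaze_dispersion > GAZE_THRESHOLD

print(drowsiness_alert(1.5, 5.2))  # both markers elevated -> True
print(drowsiness_alert(1.5, 1.0))  # gaze still steady     -> False
```

Demanding that the EEG and the eye tracker agree before alerting is one plausible way to make such a system unobtrusive in practice: a stray reading on either channel alone does nothing.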
But Ison believes the potential applications for the research, which received funding from the Engineering and Physical Sciences Research Council (EPSRC), are much wider.
“Monitoring the alertness of drivers is just one of many potential applications for this work,” he says. “Building on the foundation provided by our EPSRC-funded project, we hope to see the first of these starting to become feasible within the next three to five years.”
The team believes the research could ultimately be built on to dispense with the need for computer game players to physically interact with any type of console, mouse or other hand-operated system.
Instead, eye movement and brain activity data would be collected and processed to indicate what action the player wants to take.
By distinguishing the tiny differences in various types of brain activity, the EEG would identify the precise action the player desires, while the eye movement data would show exactly where on the screen the player was looking when they had this thought.
This information could be combined to enable the correct action to occur. An unobtrusive headset would be all that would be required to capture the necessary data.
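The gaze-plus-intent idea can be illustrated with a minimal sketch. Everything concrete here is invented for illustration: the decoded intent label, the named screen regions, and the dispatch rule all stand in for whatever a real classifier and game would provide.

```python
# Hypothetical sketch: combine an EEG-decoded "intent" label with the gaze
# point to select an on-screen action. Regions and labels are invented.

from typing import Dict, Optional, Tuple

# Screen regions a player might look at: name -> (x0, y0, x1, y1)
REGIONS: Dict[str, Tuple[int, int, int, int]] = {
    "inventory": (0, 0, 200, 600),
    "map":       (200, 0, 800, 600),
}

def region_at(gaze_xy: Tuple[int, int]) -> Optional[str]:
    """Return the name of the region under the gaze point, if any."""
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def dispatch(intent: str, gaze_xy: Tuple[int, int]) -> str:
    """Fuse what the EEG says the player wants with where they are looking."""
    target = region_at(gaze_xy)
    return "no action" if target is None else f"{intent} {target}"

print(dispatch("select", (100, 300)))  # -> "select inventory"
```

The EEG supplies the verb and the eye tracker supplies the noun; neither signal alone is enough to trigger an action.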
The research could also have implications for wheelchair users with no arm functionality by allowing them to move their wheelchairs simply through their eye movements.
These movements could be tracked and the corresponding brain activity analysed to identify when these indicate a desire to move in a certain direction. This would then automatically activate a steering and propulsion mechanism that would drive the wheelchair to that place.
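The steering idea sketched above could look something like the following. The gaze-angle convention, the dead-zone width, and the EEG confirmation flag are all assumptions made for the illustration.

```python
# Hypothetical sketch: map a gaze direction, confirmed by an EEG
# "intend to move" flag, to a wheelchair steering command.
# Angle convention and thresholds are illustrative assumptions.

def steering_command(gaze_angle_deg: float, eeg_confirms: bool) -> str:
    """Bearing of the gaze: 0 = straight ahead, positive = left of centre.

    Only issue a movement command when brain activity confirms the
    intent to move, so that merely looking around does not steer the chair.
    """
    if not eeg_confirms:
        return "stop"                 # no confirmed intent: stay put
    if abs(gaze_angle_deg) < 15:      # small dead zone around straight ahead
        return "forward"
    return "turn left" if gaze_angle_deg > 0 else "turn right"

print(steering_command(5.0, True))     # -> "forward"
print(steering_command(-40.0, True))   # -> "turn right"
print(steering_command(-40.0, False))  # -> "stop"
```

Gating the command on the brain signal is the key safety property: gaze alone never moves the chair.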
Finally the work could also provide the basis for improved tests to diagnose dyslexia and other reading disorders.
Current tests revolve around a rapid succession of single words flashed onto a computer screen, with the resulting brain activity monitored by EEG.
The new technique could enable the person being tested to move their eyes and read longer passages of text in a natural way, making the tests much more realistic and revealing.
With the basic concept now demonstrated successfully, the team aim to continue their work and eventually develop software that, in real time, automatically monitors both eye movement and brain activity.