The m(eye)DAQ system.

Loughborough student designs blink-to-speech system in a week

Mechanical engineering student Robert Green created a low-cost digital letter board system that gives paralysis sufferers a voice.

How it works

The digital letter board, called the m(eye)DAQ, helps people with paralysis communicate, using blink and finger-tap detection to build words and sentences that are then read aloud.

Created from low-cost, easy-to-source apparatus, the blink detection system is made up of a pair of 3D cinema glasses and a reverse parking sensor, which is repurposed as an optical sensor. The finger movement detection system is a single switch created using a doorbell, which senses finger taps.
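The article does not describe the detection algorithm itself, so the following is only a minimal Python sketch of how threshold-based blink and tap detection from those two sensors might work; the voltage threshold, debounce time and simulated sample values are illustrative assumptions, not details of Robert's design.

```python
# Hypothetical sketch of threshold-based event detection for an optical blink
# sensor plus a doorbell-style tap switch. All constants are assumed values.

BLINK_THRESHOLD = 2.5   # volts from the optical (parking) sensor -- assumed
TAP_DEBOUNCE_S = 0.15   # ignore switch closures faster than this -- assumed


def detect_events(samples):
    """Yield (time, 'blink' | 'tap') from (time, optical_volts, switch_closed) samples."""
    blink_active = False
    last_tap = -TAP_DEBOUNCE_S
    for t, optical_v, switch_closed in samples:
        # Report a blink once, when the optical signal first crosses the threshold.
        if optical_v > BLINK_THRESHOLD and not blink_active:
            blink_active = True
            yield t, "blink"
        elif optical_v <= BLINK_THRESHOLD:
            blink_active = False
        # A finger tap is a debounced closure of the doorbell switch.
        if switch_closed and (t - last_tap) >= TAP_DEBOUNCE_S:
            last_tap = t
            yield t, "tap"


if __name__ == "__main__":
    # Simulated samples standing in for real sensor readings.
    fake_samples = [
        (0.00, 0.4, False),
        (0.05, 3.1, False),   # blink starts
        (0.10, 3.0, False),
        (0.15, 0.5, False),   # blink ends
        (0.20, 0.4, True),    # finger tap
        (0.25, 0.4, True),    # switch still held -- debounced away
    ]
    for t, event in detect_events(fake_samples):
        print(f"{t:.2f}s: {event}")
```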

The software interprets the incoming data to detect whether a blink or finger tap has taken place. The user scrolls up and down the screen to select letters or words from the digital letter board and build sentences, which are then read out loud by the computer, aided by a predictive text function.
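A minimal sketch of that scan-and-select loop is below, assuming one input (the finger tap) advances the highlight and the other (the blink) confirms a selection. The board layout, the "SPEAK" command and the use of pyttsx3 for text-to-speech are assumptions for illustration; the predictive-text function is omitted.

```python
# Hypothetical scanning letter board: taps scroll the highlight, blinks select.

LETTER_BOARD = list("ETAOINSHRDLUCMFWYPGBVKJXQZ") + [" ", "SPEAK"]


def speak(sentence: str) -> None:
    """Read the sentence aloud if a TTS engine is available, otherwise print it."""
    try:
        import pyttsx3  # optional dependency; any text-to-speech engine would do
        engine = pyttsx3.init()
        engine.say(sentence)
        engine.runAndWait()
    except ImportError:
        print(f"[spoken] {sentence}")


def run_board(events):
    """Consume a stream of 'tap' / 'blink' events and build up a sentence."""
    cursor, sentence = 0, []
    for event in events:
        if event == "tap":
            cursor = (cursor + 1) % len(LETTER_BOARD)   # scroll to the next item
        elif event == "blink":
            choice = LETTER_BOARD[cursor]
            if choice == "SPEAK":
                speak("".join(sentence))                # read the sentence aloud
                sentence.clear()
            else:
                sentence.append(choice)                 # add the highlighted letter
            cursor = 0                                  # restart the scan
    return "".join(sentence)


if __name__ == "__main__":
    # Simulated events: scan to "I", select it, scan to "T", select it,
    # then scan to "SPEAK" so the sentence is read out.
    demo = ["tap"] * 4 + ["blink"] + ["tap"] * 1 + ["blink"] + ["tap"] * 27 + ["blink"]
    run_board(demo)
```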

Cutting the cost of communication

Blink-to-speech devices currently on the market cost thousands of pounds, but Robert wanted to produce a novel method of communication that would cost as little as possible.

“As someone who has witnessed first-hand the frustration faced by sufferers of severe paralysis, I wanted to produce a tool that had the potential to significantly improve their quality of life and remove the need for an interpreter,” Robert explains. “No-one should feel that other people have to adapt themselves to understand them; a significant amount of upset and frustration comes with this. As such, giving anyone who doesn’t have a voice the opportunity to speak for themselves opens up a whole new lease of life.
 
“There are a number of systems out there based on different technologies, such as tracking the eyes through cameras, and many of them cost thousands of pounds,” he continues. “With my open-source software and low-cost components, my system comes in at well under £200.”

Freedom of speech in a week

Robert developed his idea while working as an applications engineer intern at National Instruments, where he had the chance to work in depth with high-accuracy data acquisition and build up a wealth of programming knowledge.

“The project gave me the opportunity to develop my skills in various aspects of engineering, for example, designing and producing the circuit for the sensor and subsequently integrating this into the device.

“I spent a significant amount of time developing my understanding of data acquisition and interpretation of signals,” he goes on. “This extensive training and experience gave me the knowledge and tools that I required to develop such a project and allowed me to take my engineering concepts and bring them to fruition.

“The project itself took only a week to go from concept to a full working prototype, including all of the back-end software. This is a testament to not only the engineering knowledge that I received at Loughborough University but also the power of the tools from National Instruments.”

Plans for the product

Robert has been amazed by the positive reception that the project has received, and in the future he would like to add the ability to send social media updates using the device, giving even more independence to the user.

“I hope to take it further by incorporating even more means of communication, such as email and text messaging as well as improving all of the underlying detection and prediction algorithms. If the project has the potential to enhance even a single person’s life then I will consider it a success.”

Future projects

With the skills he has learnt creating the m(eye)DAQ, Robert would like to continue finding ways of helping people.

“My utmost ambition is to continue to work at the intersection of healthcare and engineering. The knowledge and skills that I have obtained can have a significant impact on people’s lives,” he concludes.
