‘Earables’ the next milestone in wearable tech, say engineers

Image credit: Dreamstime

A team of engineers based at the University of Illinois Grainger College of Engineering is exploring a new sub-area of mobile technology based on sophisticated and connected earphones.

“The leap from today’s earphones to 'earables' would mimic the transformation that we saw from basic phones to smartphones,” explained Professor Romit Roy Choudhury, an expert in electrical and computer engineering. “Today’s smartphones are hardly a calling device anymore, much like how tomorrow’s earables will hardly be a smartphone accessory.”

Choudhury and his colleagues – who are drawing from a wide range of fields – are developing new algorithms and experimenting with them on earphone platforms with live users. They hope that in the future, these wearable devices will continuously sense human behaviour, run acoustic augmented reality, have digital assistants whisper information, and track user health and fitness.

The researchers have published a series of papers in this area, including a recent paper on voice localisation for earphones.

“If you want to find a store in a mall, the earphone could estimate the relative location of the store and play a 3D voice that simply says 'Follow me'. In your ears, the sound would appear to come from the direction in which you should walk, as if it’s a voice escort,” said PhD student Zhijian Yang.
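Making a voice appear to come from a particular direction can be approximated with binaural cues: a small interaural time difference (ITD) and level difference (ILD) between the two earpieces. The sketch below is a hypothetical illustration of that idea, not the team's published algorithm; the head radius, gains and the Woodworth ITD approximation are all assumptions made for the example.

```python
import numpy as np

SAMPLE_RATE = 44_100          # Hz
HEAD_RADIUS = 0.0875          # metres; an average head radius (assumption)
SPEED_OF_SOUND = 343.0        # m/s

def spatialise(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Pan a mono signal so it seems to come from `azimuth_deg`
    (0 = straight ahead, positive = to the listener's right), using a
    crude interaural time difference (ITD) and level difference (ILD)."""
    az = np.radians(azimuth_deg)
    # Woodworth approximation for the extra path length to the far ear.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (abs(az) + np.sin(abs(az)))
    delay = int(round(itd * SAMPLE_RATE))
    delayed = np.concatenate([np.zeros(delay), mono])   # far ear, late
    direct = np.concatenate([mono, np.zeros(delay)])    # near ear, on time
    near_gain, far_gain = 1.0, 0.6   # crude level difference (assumption)
    if azimuth_deg >= 0:             # source on the right: left ear is far
        left, right = far_gain * delayed, near_gain * direct
    else:
        left, right = near_gain * direct, far_gain * delayed
    return np.stack([left, right], axis=1)   # (n_samples, 2) stereo

# A half-second 440 Hz tone rendered 45 degrees to the right.
t = np.arange(int(0.5 * SAMPLE_RATE)) / SAMPLE_RATE
stereo = spatialise(np.sin(2 * np.pi * 440 * t), 45.0)
```

A real earable would use measured head-related transfer functions rather than these two cues alone, but even this crude panning is enough to make a "Follow me" prompt lean towards one side.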

'EarSense: Earphones as a Teeth Activity Sensor' considers how earphones could sense facial and in-mouth activities such as teeth movements and taps, enabling a form of hands-free communication to smartphones and other computing devices. Moreover, as various medical conditions manifest in the chattering of teeth, a smart earphone could make it possible to identify them. The team is planning to look into whether sensors incorporated into earphones could be used to analyse facial muscle movements and thus emotions.
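One simple way to pick out discrete events such as teeth taps from an in-ear vibration signal is short-time energy thresholding: frame the signal, compare each frame's energy to the background level, and flag the outliers. The snippet below is an illustrative stand-in for that idea only; it is not the EarSense method, and the sample rate, frame size and threshold are arbitrary assumptions.

```python
import numpy as np

SAMPLE_RATE = 1_000   # Hz; hypothetical in-ear vibration sensor rate

def detect_taps(signal: np.ndarray, threshold: float = 4.0) -> list[int]:
    """Return the start sample of each 20 ms frame whose energy exceeds
    `threshold` times the median frame energy -- a crude detector for
    short, sharp bursts such as teeth taps."""
    frame = 20                                   # 20 ms frames at 1 kHz
    n = len(signal) // frame
    energy = (signal[: n * frame].reshape(n, frame) ** 2).sum(axis=1)
    floor = np.median(energy) + 1e-12            # robust background level
    return [i * frame for i, e in enumerate(energy) if e > threshold * floor]

# Synthetic check: low-level noise with two sharp "taps" injected.
rng = np.random.default_rng(1)
sig = 0.01 * rng.standard_normal(2000)
sig[500:510] += 1.0
sig[1500:1510] += 1.0
taps = detect_taps(sig)
```

Distinguishing taps from chewing or speech, as the paper sets out to do, would need richer features than raw energy, but the framing-and-thresholding skeleton is the common starting point.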

A third paper, 'Voice Localisation Using Nearby Wall Reflections', explores the use of algorithms to detect the direction of a sound; this would allow a wearer’s earphones to tune into the direction of a person who is speaking to them.
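The classic building block for estimating a sound's direction from two microphones is the time difference of arrival (TDOA), found as the peak of the cross-correlation between the two channels. The sketch below shows that textbook technique under far-field assumptions; the mic spacing and sample rate are invented for the example, and the paper's wall-reflection method goes well beyond this two-mic baseline.

```python
import numpy as np

SAMPLE_RATE = 16_000      # Hz
MIC_SPACING = 0.15        # metres between the two earpiece mics (assumption)
SPEED_OF_SOUND = 343.0    # m/s

def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate a talker's bearing in degrees off the forward axis from
    the time difference of arrival (TDOA) between two microphones,
    taken as the peak lag of their cross-correlation."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)     # lag in samples
    tdoa = lag / SAMPLE_RATE
    # Far-field geometry: sin(theta) = c * tdoa / mic spacing.
    sin_theta = np.clip(SPEED_OF_SOUND * tdoa / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: the same noise burst arriving 3 samples later at the
# right mic, i.e. a talker off to the left under this sign convention.
rng = np.random.default_rng(0)
signal = rng.standard_normal(1024)
delay = 3
right_mic = np.concatenate([np.zeros(delay), signal[:-delay]])
bearing = estimate_bearing(signal, right_mic)
```

With only two closely spaced microphones, front-back and up-down directions are ambiguous; exploiting echoes from a nearby wall, as the paper does, is one way to add the extra geometric constraints.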

“We’ve been working on mobile sensing and computing for 10 years,” said PhD student Yu-Lin Wei. “We have a lot of experience with which to define this emerging landscape of earable computing.”

Speaking to E&T recently, Anders Andréen, CEO of Swedish audio brand Urbanista, said that sensors were likely to play a role in making consumer audio products more helpful in the future; for instance, in adjusting active noise control to the wearer’s environment.

“We’ll go into exciting times with sensors, with gyros,” he said. “I think there’s plenty to dive into. We’ll go from the old phones into smart hearables and that will be interesting.”
