
Robots to 'hear better' with new microphone technique

A new microphone system that could help future robots pay closer attention to what they hear has been developed by UK scientists.

The group of researchers from Imperial College London created a technology that makes it possible to zoom in on conversations in noisy rooms, much as people must do at a party, or to home in on speech from far away. This could help robots listen and take instructions, for instance in hospitals, according to the scientists.

Lead researcher Patrick Naylor said: “Being able to pick out particular conversations or voices in a crowd is a real challenge for everyday devices like phones and hearing aids.

“Humans have an extraordinary ability to tune in to particular sounds, picking voices out of a noisy environment. Complex processing in the brain learns to ignore unwanted sounds like traffic or other chatter in order to pick out the important sounds we want to hear.

“Artificial intelligence isn't as smart as that. At the moment robots, phones and other devices using speech recognition don't work well when you're not close to the microphone or in a quiet space because there's just too much noise.

“Until now, microphones haven't been able to separate one sound or conversation from another in 3D.”

The new system, being demonstrated at the Royal Society's Summer Science Exhibition, which opens to the public in London tomorrow, uses 32 microphones placed around a sphere. Tiny differences in the time it takes sound to reach each of the microphones are measured, and the information is analysed by a computer.
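To give a feel for the kind of calculation involved, the short Python sketch below estimates the arrival-time differences a distant sound would produce across a 32-microphone sphere. The array radius, the microphone layout and the plane-wave assumption are illustrative guesses for this sketch, not details taken from the Imperial College system.

```python
# Minimal sketch (not the team's actual code): expected arrival-time
# differences at a 32-microphone spherical array for sound coming from
# a chosen direction. Radius, mic layout and the plane-wave model are
# assumptions made for illustration.
import numpy as np

C = 343.0        # speed of sound in air, m/s
RADIUS = 0.1     # assumed sphere radius, m
N_MICS = 32

def fibonacci_sphere(n, radius):
    """Spread n points quasi-uniformly over a sphere (illustrative layout)."""
    i = np.arange(n)
    phi = np.arccos(1 - 2 * (i + 0.5) / n)          # polar angle
    theta = np.pi * (1 + 5 ** 0.5) * i              # golden-angle azimuth
    return radius * np.column_stack([np.sin(phi) * np.cos(theta),
                                     np.sin(phi) * np.sin(theta),
                                     np.cos(phi)])

def arrival_delays(mic_positions, azimuth, elevation):
    """Relative arrival times (s) of a far-field plane wave at each mic."""
    u = np.array([np.cos(elevation) * np.cos(azimuth),
                  np.cos(elevation) * np.sin(azimuth),
                  np.sin(elevation)])                # unit vector towards the source
    # A mic further along the source direction hears the wavefront earlier.
    delays = -(mic_positions @ u) / C
    return delays - delays.min()                     # make delays non-negative

mics = fibonacci_sphere(N_MICS, RADIUS)
taus = arrival_delays(mics, azimuth=np.radians(40), elevation=np.radians(10))
print(f"Largest time difference across the array: {taus.max() * 1e6:.1f} microseconds")
```

For a 10 cm sphere the largest gap is only a few hundred microseconds, which is why the processing has to be done by computer rather than by ear.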

In this way, the system can tune into sounds coming from one particular spot, and by creating an acoustic zoom it can also listen to a sound source far away and hear it as if it were much closer. The team hope to give robots the same awareness and understanding of the 3D soundscape that humans have.
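One standard way to turn those time differences into an "acoustic zoom" is delay-and-sum beamforming: shift each channel by the delay expected for the chosen spot and average them, so sound from that spot adds up coherently while everything else partially cancels. The article does not say which algorithm the Imperial team uses, so the sketch below, which works on synthetic signals, is only an illustrative stand-in.

```python
# Minimal sketch of one way such a system could "tune in" to one spot:
# a delay-and-sum beamformer. The algorithm choice and the synthetic
# signals are assumptions for illustration, not the team's method.
import numpy as np

FS = 48_000   # sample rate, Hz

def delay_and_sum(signals, delays, fs=FS):
    """Align each channel by its expected arrival delay, then average.

    signals: (n_mics, n_samples) array of microphone recordings
    delays:  (n_mics,) expected arrival delays in seconds for the
             direction we want to focus on
    Sound from that direction adds up coherently; sound from elsewhere
    stays misaligned and partially cancels, which gives the "zoom".
    """
    n_mics, n_samples = signals.shape
    t = np.arange(n_samples) / fs
    aligned = np.empty_like(signals, dtype=float)
    for m in range(n_mics):
        # Undo channel m's delay via fractional-sample interpolation.
        aligned[m] = np.interp(t, t - delays[m], signals[m], left=0.0, right=0.0)
    return aligned.mean(axis=0)

# Toy demo: a 1 kHz "voice" arriving with known delays, buried in noise.
rng = np.random.default_rng(0)
n_mics, dur = 32, 0.1
t = np.arange(int(FS * dur)) / FS
true_delays = rng.uniform(0, 5e-4, n_mics)          # stand-in arrival delays
clean = np.sin(2 * np.pi * 1000 * t)
mics = np.stack([np.interp(t, t + d, clean) for d in true_delays])
noisy = mics + 0.5 * rng.standard_normal(mics.shape)

focused = delay_and_sum(noisy, true_delays)
print(f"Residual noise level, single mic: {np.std(noisy[0] - mics[0]):.2f}")
print(f"Residual noise level, beamformed: {np.std(focused - clean):.2f}")
```

Averaging 32 aligned channels cuts the uncorrelated noise by roughly the square root of 32, which is the basic effect a listener hears as the conversation being "pulled closer".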

Dr Naylor added: “Being able to listen selectively, focusing on one person, is vital for human communication. Until artificial intelligence can listen to different parts of the soundscape going on around it and pick out important conversations, AI will never properly be able to interact and converse with humans in noisy real-world situations.

“It's a big challenge to be able to take away all the extra noises completely, but this is a first step towards that.”
