
Sounds could improve a robot’s perception

Image credit: Kittipong Jirasukhanont/Dreamstime

Researchers in the US have found that robot perception could be improved significantly by adding the sense of hearing.

In what they say is the first large-scale study of the interactions between sound and robotic action, researchers at Carnegie Mellon University’s (CMU) Robotics Institute found that sounds could help a robot differentiate between objects, such as a metal screwdriver and a metal wrench.

According to the researchers, hearing could also help robots determine what type of action caused a sound and help them use sounds to predict the physical properties of new objects. “A lot of preliminary work in other fields indicated that sound could be useful, but it wasn’t clear how useful it would be in robotics,” said Lerrel Pinto, who worked on the study. 

Pinto and his colleagues found that performance was high: a robot using sound correctly classified objects 76 per cent of the time. The results were so encouraging, he added, that it might prove useful to equip future robots with instrumented canes, enabling them to tap on objects they want to identify.
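The kind of sound-based classification described above can be illustrated with a minimal sketch; this is not the CMU team's actual pipeline, but a toy version of the idea: extract a coarse spectral feature from each audio clip, then assign a new clip to the nearest class centroid. The synthetic "metal" and "wood" clips below are stand-ins for real impact recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_feature(clip, n_bins=8):
    """Average magnitude spectrum, collapsed into a few frequency bins."""
    spectrum = np.abs(np.fft.rfft(clip))
    return np.array([b.mean() for b in np.array_split(spectrum, n_bins)])

def make_clip(base_freq, sr=8000, dur=0.25):
    """Synthetic 'impact' sound: a decaying tone plus noise (a stand-in
    for the real audio a microphone would record)."""
    t = np.arange(int(sr * dur)) / sr
    tone = np.sin(2 * np.pi * base_freq * t) * np.exp(-8 * t)
    return tone + 0.05 * rng.standard_normal(t.size)

# Two hypothetical object classes whose impacts ring at different pitches.
train = {"metal": [spectral_feature(make_clip(2000)) for _ in range(10)],
         "wood":  [spectral_feature(make_clip(300))  for _ in range(10)]}
centroids = {label: np.mean(feats, axis=0) for label, feats in train.items()}

def classify(clip):
    """Nearest-centroid classification in the spectral-feature space."""
    f = spectral_feature(clip)
    return min(centroids, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))

print(classify(make_clip(1900)))  # a clip ringing near 2 kHz -> metal
print(classify(make_clip(350)))   # a clip ringing near 300 Hz -> wood
```

A real system would use richer features (e.g. learned spectrogram embeddings) and a trained classifier rather than nearest centroids, but the principle is the same: objects that sound different occupy different regions of feature space.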

As part of the study, the researchers created a large dataset, simultaneously recording video and audio of 60 common objects – such as toy blocks, hand tools, shoes, apples and tennis balls – as they slid or rolled around a tray and crashed into its sides. They have since released this dataset, cataloguing 15,000 interactions, for use by other researchers.

The team captured these interactions using an experimental apparatus they called Tilt-Bot – a square tray attached to the arm of a Sawyer robot. They could place an object in the tray and let Sawyer spend a few hours moving the tray in random directions with varying levels of tilt as cameras and microphones recorded each action.
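The collection loop described above can be sketched as follows. This is a hypothetical simulation, not the real Sawyer control code: the tilt-angle ranges and file names are assumptions, and the real apparatus would store actual video and audio clips alongside each logged action.

```python
import random

def tilt_bot_session(object_name, n_actions=5, seed=0):
    """Simulated Tilt-Bot loop: apply random tray tilts and log one
    interaction record per action (placeholder paths stand in for the
    recorded video/audio files)."""
    rng = random.Random(seed)
    dataset = []
    for step in range(n_actions):
        action = {"roll": rng.uniform(-30, 30),   # tilt angles in degrees
                  "pitch": rng.uniform(-30, 30)}  # (ranges are assumptions)
        dataset.append({"object": object_name,
                        "step": step,
                        "action": action,
                        "video": f"{object_name}_{step}.mp4",
                        "audio": f"{object_name}_{step}.wav"})
    return dataset

data = tilt_bot_session("wrench", n_actions=3)
print(len(data))  # 3 interaction records for one object
```

Running such a loop unattended for a few hours per object is what makes it practical to accumulate thousands of labelled action-sound-video triples.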

According to the researchers, this process was an efficient way to build a large dataset. They also collected some data beyond the tray, using Sawyer to push objects on a surface.

Though the size of this dataset is unprecedented, other researchers have also studied how intelligent agents can glean information from sound. For instance, Oliver Kroemer, assistant professor of robotics, led research into using sound to estimate the amount of granular materials, such as rice or pasta, by shaking a container or estimating the flow of those materials from a scoop.

Researchers devised an apparatus called Tilt-Bot to build a collection of actions, video and sound to improve robot perception. Objects were placed in a tray attached to a robot arm, which then moved the tray randomly while recording video and audio.

Image credit: Carnegie Mellon University

Pinto said the usefulness of sound for robots was therefore not surprising, though he and the others were surprised at just how useful it proved to be. They found, for instance, that a robot could use what it learned about the sound of one set of objects to make predictions about the physical properties of previously unseen objects.

“I think what was really exciting was that when it failed, it would fail on things you expect it to fail on,” he said. For instance, a robot couldn’t use sound to tell the difference between a red block and a green block. “But if it was a different object, such as a block versus a cup, it could figure that out.”
