Radar sensor system to help blind people identify objects

St Andrews University researchers have developed a novel radar-based system that can recognise various objects, including body parts, makes of smartphones and food.

Dubbed the RadarCat (for radar categorisation), the system could, for example, help blind users distinguish between two identical bottles with different liquids inside.

In a video demonstration created by the St Andrews team, the system, a radar sensor connected to a laptop, is shown promptly identifying various objects, including a CD, a book, a mousepad and the exact make of a smartphone.

The researchers used a sensor developed by Google as part of the company’s Project Soli. The sensor’s original purpose was to detect the smallest motions of human fingers, but the St Andrews team built entirely new software and applications around it. The team expects Google to eventually start installing Soli in its devices, which would make RadarCat easy to commercialise.
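The article does not detail how RadarCat’s software distinguishes objects, but the general approach, training a machine-learning classifier on features extracted from the raw radar signal, can be sketched briefly. In the minimal Python sketch below, the feature extraction, the synthetic training data and all names are illustrative assumptions, not the St Andrews team’s actual code.

```python
# Illustrative sketch only: classify objects placed on a radar sensor
# from simple per-channel statistics of the returned signal, using an
# off-the-shelf random forest. All names here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Reduce a (n_frames, n_channels) block of radar samples to one
    feature vector: mean, std, min and max per receiving channel."""
    return np.concatenate([
        frames.mean(axis=0),
        frames.std(axis=0),
        frames.min(axis=0),
        frames.max(axis=0),
    ])

# Training: several labelled recordings per object placed on the sensor.
# Synthetic stand-in data; a real system would record from the hardware.
rng = np.random.default_rng(0)
labels = ["empty_glass", "full_glass", "smartphone", "apple"]
X = np.stack([extract_features(rng.normal(i, 1.0, size=(64, 8)))
              for i, _ in enumerate(labels) for _ in range(20)])
y = [name for name in labels for _ in range(20)]

clf = RandomForestClassifier(n_estimators=100).fit(X, y)

# Recognition: classify a fresh block of samples from the sensor.
new_frames = rng.normal(1, 1.0, size=(64, 8))  # stand-in for live data
print(clf.predict([extract_features(new_frames)])[0])
```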

“The Soli miniature radar opens up a wide range of new forms of touchless interaction,” said Professor Aaron Quigley, Chair of Human Computer Interaction at St Andrews.

“Once Soli is deployed in products, our RadarCat solution can revolutionise how people interact with a computer, using everyday objects that can be found in the office or home, for new applications and novel types of interaction.”

The system can also be connected to other apps that provide additional information about the recognised objects. For example, placing a food item on the sensor could bring up detailed nutritional data on a computer or smartphone screen. Similarly, it could display the technical specifications of electronic devices while the user is browsing in shops.

In restaurants, the sensor could alert a waiter that a customer has just finished his or her drink and might need a refill.
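The article describes these app integrations only at a high level. A minimal sketch of the idea, in which the recognised object’s label keys into application-specific lookups, might look as follows; the dictionaries, labels and function name are all hypothetical.

```python
# Hypothetical app layer on top of the recogniser: map a recognised
# object label to extra information or an action. Purely illustrative.
NUTRITION = {"apple": "52 kcal per 100 g", "orange": "47 kcal per 100 g"}
SPECS = {"smartphone_model_x": "5.5 in display, 64 GB storage"}

def on_object_recognised(label: str) -> str:
    """Dispatch a recognised label to whichever app handles it."""
    if label in NUTRITION:
        return f"Nutritional info: {NUTRITION[label]}"
    if label in SPECS:
        return f"Device specs: {SPECS[label]}"
    if label == "empty_glass":
        return "Alert waiter: customer may need a refill"
    return f"No app registered for '{label}'"

print(on_object_recognised("apple"))
print(on_object_recognised("empty_glass"))
```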

In the video, the team also showed that the system can recognise different body parts, with this information subsequently passed on to various health apps.

“Our future work will explore object and wearable interaction, new features and fewer sample points to explore the limits of object discrimination,” Professor Quigley said.

“Beyond human computer interaction, we can also envisage a wide range of potential applications ranging from navigation and world knowledge to industrial or laboratory process control.”
