An algorithm mimicking the ability of insects to efficiently track objects in complex environments will help improve the visual ability of robots.
Developed by researchers from the University of Adelaide, the algorithm is designed to reduce distractions from the background and works up to 20 times faster than current state-of-the-art tracking algorithms.
"This type of performance can allow for real-time applications using quite simple processors," said Steven Wiederman, a neuroscientist at the University of Adelaide, who was inspired to develop the technology after studying the responses of neurons to visual stimuli in the brain of a dragonfly.
Despite its minuscule brain and limited visual acuity, a dragonfly can chase its prey at an astounding speed of 60 km/h. Moreover, the insect catches its target in 97 per cent of cases.
The researchers have already tested the algorithm in virtual settings and are now planning to test it on a bio-inspired autonomous robot in order to prove the benefits.
"Instead of just trying to keep the target perfectly centred in its field of view, our system locks on to the background and lets the target move against it," explained PhD student Zahra Bagheri, the lead author of a paper describing the invention, published in the Journal of The Royal Society Interface.
"This reduces distractions from the background and gives time for underlying brain-like motion processing to work. It then makes small movements of its gaze and rotates towards the target to keep the target roughly frontal."
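The behaviour described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not the researchers' published implementation: the gaze stays locked (keeping the background stable) while the target drifts within a small dead zone, and only small corrective rotations are made once the target strays too far from the centre of view. The constants `MAX_TURN` and `DEAD_ZONE` are illustrative assumptions.

```python
MAX_TURN = 0.05   # radians per step: gaze moves only in small increments
DEAD_ZONE = 0.10  # radians: target may drift this far before any correction

def gaze_update(gaze_angle, target_angle):
    """Return a new gaze angle, nudged slightly towards the target.

    While the target sits inside the dead zone the gaze does not move,
    mimicking a stabilised, background-locked view; otherwise the gaze
    rotates by at most MAX_TURN towards the target to keep it roughly
    frontal.
    """
    error = target_angle - gaze_angle
    if abs(error) <= DEAD_ZONE:
        return gaze_angle  # background stays locked; target drifts freely
    step = max(-MAX_TURN, min(MAX_TURN, error))
    return gaze_angle + step

# A target drifting slowly across the field of view: the gaze ignores
# the early small displacements, then follows in small rotations.
gaze = 0.0
for target in [0.02, 0.05, 0.12, 0.20, 0.30]:
    gaze = gaze_update(gaze, target)
```

The dead zone is what distinguishes this scheme from naive re-centring: by tolerating small drift, the view of the background stays stable for long enough for motion processing to separate the target from the clutter.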
A robot equipped with the algorithm would be able to actively pursue an object without giving in to distractions.
"Detecting and tracking small objects against complex backgrounds is a highly challenging task," Bagheri said.
"Consider a cricket or baseball player trying to take a match-winning catch in the outfield. They have seconds or less to spot the ball, track it and predict its path as it comes down against the brightly coloured backdrop of excited fans in the crowd - all while running or even diving towards the point where they predict it will fall."
Robots still have some way to go before they can combine sharp eyes with quick reflexes and flexible muscles. However, the insect-inspired algorithm could bring them one step closer.