Artificial eyes could be left for years to monitor war zones and outer space
Image credit: Dreamstime
London researchers are beginning a project to explore how artificial eyes - ‘silicon retinas’ - could be made to function as effectively as the real thing. This could help develop systems to monitor dangerous environments without the need for human intervention.
The Internet of Things (IoT) is an emerging network of interconnected electronic devices and smart household objects which can be operated remotely. A fully fledged IoT could transform how we live, but this would put enormous demands on the capabilities of our technology.
For instance, monitoring live video feeds from hospitals, homes or roads on our devices could be useful, but transmitting raw footage is inefficient in terms of energy consumption and reaction times. We would find ourselves constantly draining our devices’ batteries and may lose fine detail.
This three-year project, led by Kingston University, seeks to explore how an artificial vision system could become as efficient as a human eye. Such an efficient system would be useful in the IoT and near-future robotics.
The researchers will base their work on newly developed dynamic vision sensors, or silicon retinas, which are designed around the mammalian eye. Mammalian eyes – which do not use pixels and frames – are far more efficient than their artificial counterparts. Silicon retinas reduce memory requirements by only updating the parts of an image where movement (a change in light) is detected.
This reduces power consumption by a factor of 10 compared with typical visual systems. The silicon retina can also capture different parts of a scene at different rates – up to 2,000 frames per second when necessary.
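The data saving described above can be illustrated with a simple simulation. The sketch below is not the sensor's actual firmware – the threshold value and the scene are invented for illustration – but it shows the basic principle: a conventional camera transmits every pixel of every frame, while an event-based sensor transmits only the pixels whose brightness has changed.

```python
import numpy as np

THRESHOLD = 0.15  # illustrative brightness-change threshold, not a real sensor spec


def frame_based(frames):
    """Conventional capture: every pixel of every frame is transmitted."""
    return sum(f.size for f in frames)


def event_based(frames):
    """Silicon-retina-style capture: a pixel generates an event only when
    its brightness changes by more than the threshold since the last frame."""
    events = 0
    previous = frames[0]
    for frame in frames[1:]:
        changed = np.abs(frame - previous) > THRESHOLD
        events += int(changed.sum())
        previous = frame
    return events


# A mostly static 64x64 scene with one small moving patch of "motion".
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
frames = []
for t in range(10):
    f = scene.copy()
    f[t:t + 4, t:t + 4] += 0.5  # a bright patch drifting diagonally
    frames.append(f)

print(frame_based(frames))  # every pixel of every frame: 10 * 64 * 64 = 40960
print(event_based(frames))  # only the pixels around the moving patch
```

Because the background never changes, the event-based count stays in the low hundreds while the frame-based count is in the tens of thousands – the same asymmetry Professor Martini describes between static areas and fast-moving sections.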
“Conventional camera technology captures video in a series of separate frames, or images, which can be a waste of resources if there is more motion in some areas than others,” said Professor Maria Martini, who is leading the Kingston University team in the Internet of Silicon Retinas project.
“Where you have a really dynamic scene, like an explosion, you end up with fast-moving sections not being captured accurately due to frame-rate and processing power restrictions and too much data being used to represent areas that remain static.”
The researchers will study how detailed footage could be sourced most efficiently from the silicon retinas and then shared between machines, or uploaded to the cloud. They will also consider how the sensors could work as a component of the IoT.
“This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants,” she said. “They could be implemented in small devices where people can’t go and it’s not possible to recharge the battery.”
Professor Martini added that the research could have wide-ranging applications for the use of sensors in other fields. Sensors, she suggests, could be thrown from a plane into a forest and left there for years without any need for human intervention. They could also be particularly useful for collecting footage from dangerous places, such as war zones or even other planets.