Autonomous underwater robots that can make decisions in real time based on their surroundings could hold the key to deep-sea exploration.
Scientists have long used robots that can travel unguided in underwater environments when exploring the oceans.
But the data they collect takes time to analyse and interpret, meaning the chance to act on the information in real time is lost.
A team from the University of Delaware has been experimenting with programming robots to allow them to change their behaviour in response to what is going on around them.
They ran a test experiment, after altering the code of a modular deep-sea robot called REMUS 600, to see whether new data could trigger different behaviours based on biological information, such as the appearance of squid of a certain size or concentration.
"We knew the vehicle had more capabilities than we previously had applied," said Mark Moline, director of the School of Marine Science at the University of Delaware.
Creating artificial intelligence that can respond to the dynamic ocean environment proved challenging: shifting currents keep marine organisms in constant motion, producing migrations, swimming patterns and behavioural changes that are difficult to predict.
"What you see at any given instance is going to change a moment later," Moline said.
The researchers pre-programmed the on-board computers of the REMUS to make certain decisions. While the vehicle surveyed the ocean 500 to 900 metres below the surface, the computers analysed sonar returns from marine organisms in the water, classifying them by size and density.
When acoustic sensors aboard the vehicle detected the right size and concentration of squid, it triggered a second mission to report the robot's position in the water and then run a pre-programmed grid to map the area in finer detail.
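The trigger logic described above can be sketched in a few lines. This is a simplified illustration, not the actual REMUS software: the class, function names and threshold values are all hypothetical, standing in for whatever size and concentration criteria the team pre-programmed.

```python
# A minimal sketch (hypothetical names and thresholds) of the kind of
# on-board trigger described in the article: monitor acoustic detections
# and, when targets of the right size appear at sufficient density,
# switch from the transit mission to a fine-grained grid survey.

from dataclasses import dataclass

@dataclass
class Detection:
    size_cm: float   # estimated target length from the sonar return
    density: float   # targets per cubic metre in the sampled volume

# Illustrative thresholds -- not the values used on the real vehicle.
MIN_SIZE_CM = 10.0
MAX_SIZE_CM = 40.0
MIN_DENSITY = 0.5

def should_trigger_grid_survey(detections):
    """Return True when enough right-sized targets are concentrated
    in the current sonar volume to justify the second, detailed mission."""
    matches = [d for d in detections
               if MIN_SIZE_CM <= d.size_cm <= MAX_SIZE_CM]
    return bool(matches) and max(d.density for d in matches) >= MIN_DENSITY

# Example: a dense patch of ~20cm targets triggers the grid survey,
# while small scatterers are ignored.
patch = [Detection(size_cm=22.0, density=0.8),
         Detection(size_cm=5.0, density=2.0)]
print(should_trigger_grid_survey(patch))  # True
```

In the real mission, a positive trigger would also cause the vehicle to report its position before beginning the grid pattern.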
The higher-resolution scan revealed one very concentrated collection of squid and, as the scan moved from north to south, a second, less tightly packed mass of similarly sized squid.
According to Moline, these are details that might have been missed if the REMUS had only been programmed to keep travelling in a straight line.
"It was a really simple test that demonstrated that it's possible to use acoustics to find a species, to have an AUV target specific sizes of that species, and to follow the species, all without having to retrieve and reprogram the vehicle to hunt for something that will probably be long gone by the time you are ready," he said.
Combining available robotics technologies to explore the water in this way can help fill information gaps and may illuminate scales of prey distribution that scientists don't know exist.
Meanwhile, Boston Dynamics, the robot developer owned by Google, has unveiled the latest version of its Atlas robot, a humanoid machine that is designed to function in a variety of scenarios.
In a video, the developers demonstrate how the robot could traverse rough terrain, including snow and gravel, and navigate the interior of buildings.
The robot can even open doors for itself and withstand ‘bullying’ from its developers, as shown when one of the team pushed the robot to the floor and it lifted itself back up again.
Atlas is electrically powered and hydraulically actuated to maximise its range of movement and allow it to move and carry objects.
It uses sensors in its body and legs to balance itself and LIDAR and stereo sensors in its head to avoid obstacles, assess the terrain, help with navigation and manipulate objects.
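The obstacle-avoidance role of the head-mounted range sensors can be illustrated with a toy decision rule. This is not Boston Dynamics' code; the corridor angle, safety distance and action names are invented for illustration only.

```python
# A toy sketch (hypothetical values, not Atlas's actual software) of how
# LIDAR range returns can feed a simple obstacle check: if any return
# inside the robot's forward corridor is closer than a safety distance,
# step around it; otherwise keep walking.

import math

SAFETY_DISTANCE_M = 0.5                    # illustrative clearance threshold
CORRIDOR_HALF_ANGLE = math.radians(15)     # forward cone to inspect

def next_action(lidar_scan):
    """lidar_scan: list of (angle_rad, range_m) returns, angle 0 = dead ahead.
    Return 'sidestep' if the forward corridor is blocked, else 'proceed'."""
    ahead = [r for a, r in lidar_scan if abs(a) <= CORRIDOR_HALF_ANGLE]
    if ahead and min(ahead) < SAFETY_DISTANCE_M:
        return "sidestep"
    return "proceed"

# Example: a return 0.3m dead ahead blocks the path; one at 45 degrees
# and 2m away is outside the corridor and ignored.
scan = [(0.0, 0.3), (math.radians(45), 2.0)]
print(next_action(scan))  # sidestep
```

A real humanoid fuses these range readings with balance and terrain data, but the basic pattern of sensing, thresholding and choosing an action is the same.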
An update to an older model, the new version of Atlas stands approximately 175cm tall and weighs 80kg.