Robots taught to navigate complex sea currents with algorithms
Image credit: Dreamstime
Researchers are developing algorithms that teach robots to adapt to the constantly changing dynamics of the sea, allowing them to protect and preserve aging water-rooted infrastructure, such as piers, pipelines, bridges and dams.
The constantly fluctuating environment created by waves, winds, currents, wakes from passing boats and eddies swirling around structures makes water one of the most complex environments for experienced boat captains, let alone robots.
There are typically far more underwater structures than there are divers to inspect them as often as needed. Divers must sometimes descend to extreme and dangerous depths, and recovering from such dives can take several weeks.
Mechanical engineering professor Brendan Englot at Stevens Institute of Technology said: “There are so many difficult disturbances pushing the robot around and there is often very poor visibility, making it hard to give a vehicle underwater the same situational awareness that a person would have just walking around on the ground or being up in the air.”
His research group has employed a type of artificial intelligence known as reinforcement learning, which uses algorithms that are goal-oriented rather than based on an exact mathematical model, to teach robots how to carry out a complex objective by performing actions and observing the results.
As the robot collects data, it updates its ‘policy’ to figure out optimal ways to manoeuvre and navigate underwater.
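The observe-act-update loop described above can be illustrated with a toy example. The sketch below is not the algorithm Englot's group uses; it is a minimal tabular Q-learning demonstration, with a made-up one-dimensional "current" that randomly drifts the robot, showing how a policy improves purely from observed outcomes rather than from a mathematical model of the disturbance.

```python
import random

# Toy reinforcement-learning illustration (NOT the actual research code):
# a robot on a 1-D line must reach position 4 from position 0 while a
# random "current" drifts it around. Tabular Q-learning updates a policy
# from observed outcomes alone, with no model of the drift.

ACTIONS = [-1, +1]              # move left or right
GOAL, SIZE = 4, 5
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(SIZE) for a in ACTIONS}

def step(state, action):
    drift = random.choice([-1, 0, 0, 1])            # unpredictable current
    nxt = min(max(state + action + drift, 0), SIZE - 1)
    reward = 10.0 if nxt == GOAL else -1.0          # reach goal quickly
    return nxt, reward

random.seed(0)
for episode in range(500):
    s = 0
    while s != GOAL:
        # Mostly exploit the current policy, occasionally explore.
        a = random.choice(ACTIONS) if random.random() < EPS \
            else max(ACTIONS, key=lambda a: Q[(s, a)])
        nxt, r = step(s, a)
        # Update the value estimate from the observed result.
        best_next = max(Q[(nxt, a2)] for a2 in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = nxt

# The learned policy: which action each state prefers after training.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(SIZE)}
print(policy)
```

Despite the drift pushing the robot off course, the learned policy settles on moving toward the goal from every starting position, which is the essence of learning "optimal ways to manoeuvre" from experience.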
That data comes from sonar, the most reliable tool for navigating underwater. Like a dolphin using echolocation, Englot’s robots send out high-frequency chirps and measure how long the sound takes to return after bouncing off surrounding structures - collecting data and gaining situational awareness all while being knocked around by any number of forces.
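The principle behind those chirps is simple time-of-flight ranging: sound travels at roughly 1,500 m/s in seawater (the exact speed varies with temperature, salinity and depth), and the pulse covers the distance to the structure twice, out and back. A back-of-the-envelope sketch:

```python
# Time-of-flight ranging, the principle behind the sonar chirps described
# above. 1500 m/s is a nominal speed of sound in seawater; the true value
# varies with temperature, salinity and depth.

SPEED_OF_SOUND = 1500.0  # m/s

def range_from_echo(round_trip_s: float) -> float:
    """Distance to a reflector given the echo's round-trip time."""
    # The pulse travels out and back, so halve the total path length.
    return SPEED_OF_SOUND * round_trip_s / 2.0

# An echo returning after 40 ms implies a structure about 30 m away.
print(range_from_echo(0.040))  # → 30.0
```

Many such range measurements, taken at different bearings as the robot moves, are what the navigation algorithms fuse into situational awareness.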
Englot recently sent a robot on an autonomous mission to map a Manhattan pier. “We didn’t have a prior model of that pier,” Englot said. “We were able to just send our robot down and it was able to come back and successfully locate itself throughout the whole mission.”
Guided by the algorithms, the robot moved independently, gathering information to produce a 3D map showing the location of the pier’s pilings.
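How range-and-bearing sonar returns become a map of pilings can be sketched with an occupancy grid, a standard robotic-mapping structure. The example below is a 2-D simplification of the 3-D maps described above, and the echo readings in it are invented illustrative values, not data from the pier mission.

```python
import math

# Sketch: accumulating range-and-bearing sonar returns into an occupancy
# grid. A 2-D simplification of 3-D mapping; the readings below are
# made-up illustrative values, not real survey data.

CELL = 1.0  # grid resolution in metres

def to_cell(x, y):
    """Quantise a world coordinate to a grid cell index."""
    return (int(x // CELL), int(y // CELL))

def build_map(robot_pose, readings):
    """Mark each echo's reflection point as an occupied cell.

    robot_pose: (x, y, heading_rad)
    readings:   list of (range_m, bearing_rad) relative to the heading
    """
    rx, ry, heading = robot_pose
    occupied = set()
    for rng, bearing in readings:
        angle = heading + bearing
        hit = (rx + rng * math.cos(angle), ry + rng * math.sin(angle))
        occupied.add(to_cell(*hit))
    return occupied

# Hypothetical echoes off two pilings: one 5 m straight ahead, one 5 m
# to the robot's left.
grid = build_map((0.0, 0.0, 0.0), [(5.0, 0.0), (5.0, math.pi / 2)])
print(sorted(grid))  # occupied cells near (5, 0) and (0, 5)
```

Repeating this for every ping along the robot's route fills in the positions of the pilings, which is the 2-D analogue of the pier map the robot produced.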
Ultimately, the technology could allow robots to carry out routine inspections of everything from ship hulls to offshore oil platforms. Robots could also map the Earth’s vast underwater terrain.
However, achieving these goals means addressing sonar’s limitations. “Imagine walking through a building and navigating the hallways with the same gray-scale, grainy visual resolution as a medical ultrasound,” Englot said.
Once a structure has been mapped, an autonomous robot could plan a second, higher-resolution pass, inspecting critical areas with a camera. Englot also imagines eel-like robots that could weave through crevices and narrow spaces, perhaps even assisting in rescues.
“To really take advantage of those kinds of designs first we need to be able to navigate with confidence,” he said.