Virtual reality control of robots over long distances enabled by new software
Robots can now be controlled remotely in virtual reality thanks to software developed by Brown University researchers.
The software connects a robot’s arms and grippers as well as its onboard cameras and sensors to off-the-shelf virtual reality hardware via the internet.
Using handheld controllers, users can control the position of the robot’s arms to perform intricate manipulation tasks just by moving their own arms.
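The core of that interaction is mapping the motion of a handheld controller onto a target pose for the robot's end effector. A minimal sketch of the idea in Python follows; the function name, tuple-based poses, and `scale` parameter are illustrative assumptions, not the actual API of the Brown system:

```python
def controller_to_target(ctrl_pos, ctrl_ref, robot_ref, scale=1.0):
    """Map a hand-controller displacement onto a robot end-effector target.

    ctrl_pos:  current controller position (x, y, z) in metres
    ctrl_ref:  controller position when tracking started
    robot_ref: end-effector position when tracking started
    scale:     optional gain (e.g. <1.0 for finer manipulation)
    All names and the (x, y, z) tuple convention are illustrative.
    """
    # Displacement of the user's hand since tracking began
    delta = [(c - r) * scale for c, r in zip(ctrl_pos, ctrl_ref)]
    # Apply the same displacement to the robot's end effector
    return tuple(r + d for r, d in zip(robot_ref, delta))

# Moving the controller 25 cm along x moves the arm target 25 cm along x
target = controller_to_target((0.25, 0.0, 0.0), (0.0, 0.0, 0.0),
                              (0.5, 0.0, 0.25))
print(target)  # → (0.75, 0.0, 0.25)
```

A relative mapping like this, anchored to reference poses rather than absolute coordinates, lets the user re-grip or re-centre without the robot jumping.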
They get a first-person view of the environment or can walk around the robot to survey the scene in the third person - whichever is easier for accomplishing the task at hand.
The data transferred between the robot and the virtual reality unit is compact enough to be sent over the internet with minimal lag, making it possible for users to guide robots from great distances.
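Keeping the data load small typically means thinning the sensor data before it leaves the robot. A common trick is voxel-grid downsampling - keeping only one point per small cube of space - before packing the survivors into a compact binary form. The sketch below illustrates the idea; the function names, the 5 cm voxel size, and the wire layout are assumptions for illustration, not the paper's actual pipeline:

```python
import struct

def downsample(points, voxel=0.05):
    """Keep one representative point per voxel-sized cube (5 cm here)."""
    seen = {}
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        seen.setdefault(key, (x, y, z))  # first point in each cube wins
    return list(seen.values())

def pack(points):
    """Serialise points as little-endian 32-bit floats (12 bytes each)."""
    return b"".join(struct.pack("<3f", *p) for p in points)

# A dense 1000-point strip collapses to far fewer points after voxelisation
cloud = [(i * 0.01, 0.0, 0.0) for i in range(1000)]
small = downsample(cloud)
print(len(cloud), len(small), len(pack(small)))
```

Shrinking the cloud this way trades a little spatial detail for a payload small enough to push over an ordinary internet link many times per second.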
“We think this could be useful in any situation where we need some deft manipulation to be done, but where people shouldn’t be,” said David Whitney, who co-led the development of the system.
“Three examples we were thinking of specifically were in defusing bombs, working inside a damaged nuclear facility or operating the robotic arm on the International Space Station.”
The clean-up team at Japan’s ill-fated Fukushima power plant, for example, has struggled to retrieve the most hazardous nuclear waste from the site because robots sent in are damaged beyond use when they get too close.
Giving operators a fuller awareness of a robot’s surroundings through VR could help pinpoint the location of the nuclear material more accurately and aid in its removal.

Even highly sophisticated robots are often remotely controlled by fairly primitive means - typically a keyboard, or something like a video game controller paired with a two-dimensional monitor.
That works fine, say Whitney and his co-developer, Rosen, for tasks like driving a wheeled robot around or flying a drone, but it can be problematic for more complex tasks.
“For things like operating a robotic arm with lots of degrees of freedom, keyboards and game controllers just aren’t very intuitive,” Whitney said.
In addition, mapping three-dimensional environments onto two-dimensional screens limits the user’s perception of the space the robot inhabits, a problem that VR avoids.
Their software links together a Baxter research robot with an HTC Vive, a virtual reality system that comes with hand controllers.
The software uses the robot’s sensors to create a point-cloud model of the robot itself and its surroundings, which is transmitted to a remote computer connected to the Vive. Users can see that space in the headset and virtually walk around inside it. At the same time, users see live high-definition video from the robot’s wrist cameras for detailed views of manipulation tasks to be performed.
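Each update the robot sends therefore has to bundle several kinds of data: the robot's pose (to animate its virtual model), the point cloud of the scene, and a compressed frame from the wrist camera. A minimal sketch of such a per-frame message is below; the field names and the JSON-header-plus-binary-payload layout are assumptions made for illustration, not the system's actual wire format:

```python
import json
import struct

def make_frame(points, wrist_jpeg, robot_pose):
    """Bundle one update for the VR client: a length-prefixed JSON header
    followed by the raw point-cloud bytes and the camera frame.
    Layout and field names are illustrative."""
    cloud = b"".join(struct.pack("<3f", *p) for p in points)
    header = json.dumps({
        "pose": robot_pose,            # robot pose for the virtual model
        "cloud_bytes": len(cloud),     # lets the client split the payload
        "video_bytes": len(wrist_jpeg),
    }).encode()
    # 4-byte header length, then header, then the two binary payloads
    return struct.pack("<I", len(header)) + header + cloud + wrist_jpeg

frame = make_frame([(0.0, 0.0, 0.0)], b"...jpeg bytes...", {"x": 0.0})
print(len(frame))
```

On the receiving side, the client reads the header length, parses the JSON, and slices the remaining bytes into cloud and video using the recorded sizes - a simple way to keep one update self-describing.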
For their study, the researchers showed that they could create an immersive experience for users while keeping the data load small enough that it could be carried over the internet without a distracting lag.
One user was able to perform a manipulation task, stacking plastic cups inside one another, using a robot 66km away.
In additional studies, 18 novice users were able to complete the cup-stacking task 66 per cent faster in virtual reality compared with a traditional keyboard-and-monitor interface.
Users also reported enjoying the virtual interface more, and found the manipulation tasks less demanding than with the keyboard and monitor.
“In VR, people can just move the robot like they move their bodies and so they can do it without thinking about it,” Rosen said. “That lets people focus on the problem or task at hand without the increased cognitive load of trying to figure out how to move the robot.”