Robot-VR system could allow factory workers to telecommute
Image credit: MIT CSAIL
Factory workers could do their job while sitting at home wearing an Oculus Rift, thanks to a virtual reality (VR) system developed by researchers at Massachusetts Institute of Technology (MIT).
While many office-based workers now have the option to work from home, employees in manufacturing jobs have less flexibility as this type of hands-on work typically cannot be completed without their physical presence.
Low-skilled factory jobs are expected to be among the first to vanish with the oncoming tide of automation, but researchers from the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT have now revealed a VR system that would let factory workers do their jobs remotely, by teleoperating a robot in their place.
“A system like this could eventually help humans supervise robots from a distance,” said Dr Jeffrey Lipton, who worked on the system at CSAIL.
“By teleoperating robots from home, blue-collar workers would be able to telecommute and benefit from the IT revolution just as white-collar workers do now.”
The MIT system uses an Oculus Rift headset and displays a “VR control room” full of sensor displays around the user. The workers use gestures to control the robot, which follows their movements in real time.
There are two standard approaches to using VR for teleoperation. In the direct model, the user’s vision is coupled directly to the robot’s cameras, which suffers from a limited field of view and from lag. In the cyber-physical model, the user interacts with a virtual copy of the robot and its environment, which requires far more data.
The new system sits at a half-way point between the two approaches, drawing on the “homunculus model of the mind”: the idea that a tiny person lives inside our heads, seeing and hearing all that we do and controlling our actions. Though long discarded as a model of human cognition, it turns out to be a useful way of framing robotic control. When workers put on the VR headset, they step into a virtual environment designed to feel like the inside of a robot’s brain.
This approach avoids the latency problems of the direct model, because the user receives continuous visual feedback from the locally rendered environment, while still allowing the worker to feel merged with the robot.
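The intermediate architecture described above can be sketched in a few lines: the robot streams camera frames into a locally rendered “control room”, which keeps updating at full rate even when the feed lags, while the user’s gestures are queued as robot commands. This is a minimal illustrative sketch, not the CSAIL implementation; the names (`ControlRoom`, `on_frame`, `on_gesture`) are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """A camera frame received from the robot (placeholder for real image data)."""
    seq: int
    pixels: bytes


class ControlRoom:
    """Hypothetical sketch of the intermediate 'homunculus' architecture:
    camera frames update virtual displays in a local VR scene, so rendering
    never blocks on the network, and gestures become queued robot commands."""

    def __init__(self):
        self.latest_frame = None   # most recent frame received from the robot
        self.command_queue = []    # commands waiting to be sent to the robot

    def on_frame(self, frame):
        # Update the virtual displays; drop stale or out-of-order frames.
        if self.latest_frame is None or frame.seq > self.latest_frame.seq:
            self.latest_frame = frame

    def on_gesture(self, hand_position):
        # Map the user's hand pose to a robot end-effector command.
        self.command_queue.append(("move_gripper", hand_position))

    def render(self):
        # The local scene renders continuously regardless of network delay.
        seq = self.latest_frame.seq if self.latest_frame else None
        return f"control room showing frame {seq}, {len(self.command_queue)} command(s) pending"
```

The key design point the sketch captures is decoupling: the user always sees a responsive virtual room driven by the latest available frame, rather than waiting on the robot’s delayed video feed as in the direct model.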
The CSAIL team tested their system by teleoperating a robot to pick up screws and wires and to pick up and stack blocks, and found that it outperformed existing teleoperation systems at grasping objects and completing tasks.
They also demonstrated that the system works over long distances, using it from a Washington DC hotel to control a robot in Cambridge, Massachusetts.
Although the team only demonstrated the system using Rethink Robotics’ Baxter robot – an approximately humanoid industrial robot designed to complete basic jobs on a production line – and the Oculus Rift headset, the approach could work on other platforms; they hope to make the system scalable so that it can be used with many different types of robots and automation technologies.
The researchers suggest that by gamifying factory work, the system could attract unemployed gamers – who performed well with the system – to begin working in manufacturing.