VR in the workplace
Workplace VR goes far beyond digital-paintball team-building excursions. Faster servicing is only the beginning.
After research found that service technicians spent almost a third of their time looking up and digesting information on the products they maintain, heavy-equipment maker Caterpillar began looking at ways to make its manuals easier to understand.
One option was to use video. Jim Wagner, visualisation architect for service publications at Caterpillar, says millennials will often ask, instead of reaching for a printed manual, “don’t you just have a YouTube clip I can watch?”
However, Caterpillar, in common with a number of other companies, doesn’t think the sit-back, absorb-little nature of video quite does the trick. So the company is developing a number of augmented-reality (AR) applications based on tools from CAD specialist PTC that will couple live video with animations that show how to dismantle gear, repair it and put it back together. “You have a model placed in a context where you can understand it much more easily,” says Wagner.
Today’s versions of workplace AR rely on handheld tablets held over the equipment. Hexagonal icons stuck on the bodywork tell the application which make and model it is dealing with and help it register the live image so the animated graphics are overlaid correctly. As digital eyewear becomes cheap and robust enough for the workplace, wearables will take over from tablets.
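The first job of those markers is simply identification: the app resolves a detected marker ID to the equipment it is looking at before drawing any overlay. A minimal sketch of that lookup step, with invented marker IDs and model data for illustration:

```python
# Hypothetical registry mapping fiducial-marker IDs to the equipment
# they are attached to. The IDs and metadata below are invented.
MARKER_REGISTRY = {
    "CAT-336-LEFT":  {"make": "Caterpillar", "model": "336", "panel": "left service door"},
    "CAT-336-RIGHT": {"make": "Caterpillar", "model": "336", "panel": "right service door"},
}

def identify_equipment(marker_id: str) -> dict:
    """Map a detected marker ID to the make/model the overlay should use."""
    info = MARKER_REGISTRY.get(marker_id)
    if info is None:
        raise KeyError(f"Unknown marker: {marker_id}")
    return info
```

In a real application the marker would also anchor the pose of the overlay graphics; the lookup shown here is only the identification half of that process.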
Aircraft maker Airbus sees digital eyewear as a major part of its ‘Factory of the Future’ programme. Airbus faces the problem that fuselage assembly is manually intensive and involves thousands of fasteners, each of which must be tightened correctly. A graphics overlay can guide the worker to parts of the airframe that need attention, provide live feedback on the torque to apply for each operation, and give the go-ahead for the next fastener in line once each one is finished.
Some applications will call for feedback through the network. An AR application developed by Sysmex for its medical equipment shows the operator how to clean and unblock its fluid delivery systems. As some steps need parts of the unit to be powered down first, the machine reports its status to the AR application using the network so that it only shows the next step once the machine is in the right state.
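The Sysmex behaviour amounts to state-gated step progression: each step declares the machine state it requires, and the app reveals it only once the machine has reported that state over the network. A rough sketch, with step text and state names invented for illustration:

```python
from typing import Optional

# Hypothetical cleaning procedure; each step names the machine state it needs.
STEPS = [
    {"text": "Open the fluid-system access panel", "requires": "running"},
    {"text": "Disconnect the delivery line",       "requires": "powered_down"},
    {"text": "Flush the line with cleaning fluid", "requires": "powered_down"},
]

def visible_step(step_index: int, reported_state: str) -> Optional[str]:
    """Show a step's instructions only if the machine has reported
    the state that step requires; otherwise keep it hidden."""
    step = STEPS[step_index]
    if reported_state == step["requires"]:
        return step["text"]
    return None
```

Until the machine reports `powered_down`, the app would keep showing the shutdown prompt rather than the disconnection step, preventing the operator from working on a live unit.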
Similar applications are likely to move into retail. In a project for a UK supermarket chain, PA Consulting developed a prototype based on Google’s Glass eyewear to help supermarkets and other stores check how stock is being presented. Big suppliers often have agreements that control how their goods are displayed in the stores, from pricing through to the location and size of special displays. Today, checking compliance involves a lot of paperwork.
The headset works out where to guide the wearer using Bluetooth signals from electronic beacons. On arriving at the shelf, the software shows the worker an image of how the display should look. If the display matches, they can mark it as compliant; if not, they can check whether stock needs to be brought up from the warehouse.
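One simple way such beacon guidance can work is to treat the beacon with the strongest received signal (RSSI, in dBm, where less negative means closer) as the wearer's current location. A minimal sketch, with invented beacon IDs; real systems typically smooth noisy RSSI readings rather than trusting a single sample:

```python
def nearest_beacon(rssi_readings: dict) -> str:
    """Pick the beacon with the strongest (least negative) RSSI
    as the wearer's most likely location."""
    return max(rssi_readings, key=rssi_readings.get)

# Illustrative readings the headset might see at one moment, in dBm.
readings = {
    "aisle-3-shelf-2": -62,
    "aisle-3-shelf-5": -78,
    "aisle-4-shelf-1": -85,
}
```

Here `nearest_beacon(readings)` would place the wearer at `aisle-3-shelf-2`, and the app could then pull up the display image agreed for that shelf.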
In these applications, user interface design will be crucial. The dividing line between help and being patronised by the computer is pretty narrow. We can only hope that augmented-reality applications do not suffer from ‘Clippy’ syndrome: “It looks as though you’re trying to extract a fuel rod from an overheating reactor, would you like some help with that?”