Virtual reality enters the real world

Applications are being developed for virtual environments that can speed time-to-market, improve operational efficiency, and engender innovative business practices, discovers E&T.

For too long, virtual reality (VR) had an image problem. The public perception was of geeks with clunky stereoscopic displays strapped to their heads, inwardly exploring lurid computer-simulated environments, while outwardly moving like an arthritic marionette.

But, as this decades-old technology builds bridges (real and simulated) with its near-relative, virtual worlds (VW), industry and the applied sciences are responding to the opportunities VR opens up, creating applications designed to bring operational and administrative efficiencies.

VR has been used by design engineers in sectors such as automotive and aeronautics for research and product development for years, initially in the form of computer-aided design and manufacturing systems, and more recently for virtual prototyping. VR-like systems are a training mainstay of several professions, most notably the armed forces.

These bespoke specialist systems are resource-intensive and expensive, affordable only by big agencies with big budgets. But the adoption of online virtual worlds like Linden Lab's Second Life by many corporate entities as an approved extension of their business models has highlighted virtuality's potential as a business enabler for organisations of all sizes.

The growth of the online and console gaming market is another factor in the interest in VR/VW work tools, says Dan Riley, metaverse architect at DLAB (part of the Institute of Digital Innovation at the University of Teesside), because the hardware and software behind them have much to offer VR applications.

"The games and virtual world industries are going to collide," Riley predicts. "Games consoles - like the Xbox and PlayStation - are ideal platforms from which to access virtual environment applications for serious business contexts."

The big console manufacturers - Microsoft and Sony - and the games companies have invested billions in trying to perfect the representational and interactive technologies that drive the virtual scenarios of their products. Much of this functionality could be incorporated into applications designed for, say, industrial training.

There are obvious similarities between common game scenarios - exploring a labyrinth of tunnels, for example - and risky jobs like sewer maintenance. Riley observes: "New starters to the sewer maintenance team could gain safe exposure to a network in a virtual representation, accompanied by mentoring avatars, before they set foot underground. This will help their confidence, and mean that they are less apprehensive when they first encounter the real thing."

Virtual office politics

As well as simulating stressful physical environments, VR/VW is also being directed at the often equally traumatic office space.

The potential for collaborative interaction in immersive or semi-immersive domains is one of the areas under investigation by the Virtual Environments and Computer Graphics (VECG) group in the Department of Computer Science, University College London. The VECG group's research spans the range from real-time computer graphics rendering to human factors issues in virtual reality.

One of the group's focuses is making virtual characters realistic enough that users want to engage with them: virtual people who carry out the gestures and movements that humans ordinarily take for granted. Its ultimate goal, says VECG head Anthony Steed, is a theory of virtual reality: given an application context and a set of resources, what is the best approach to take, and which algorithms, interaction techniques and rendering styles should be used?

In terms of the challenges this presents to VR engineers, Steed identifies three principal areas of research. "First, projector technology needs to become as adept at showing real-time computer graphics as it is at showing video. This will enable virtual environments to be more effectively projected onto large surfaces such as walls and ceilings. Latency and frame rates are also an issue there, and represent another technological challenge. And tracking technology has to get better so that bodily movement can be more accurately represented."

The question of whether VR/VW can provide a more effective alternative to videoconferencing is also being investigated. "Replicating eye movements is a big challenge," believes Steed. "These small things are big issues, because if you want people to interact in close collaboration in a virtual conference, then reactive eye movement is a key way of getting users to engage with the system, and acquire the confidence to do business in a virtual domain."

Make-believe medical practice

Medicine and healthcare are other sectors where VR/VW tools are being applied. Examples include anatomical educational tools; patient education; diagnostic aids; virtual autopsies; planning and guidance aids; training; and computer-augmented reality.

The accepted term, cybermedicine, is typically applied to the range of medical applications that make use of virtual environments and computer graphics, says Nigel John of the University of Wales School of Computer Science at Bangor.

"Performing a dissection of a human cadaver, [for example], has traditionally been considered the optimum method for students to gain an excellent spatial understanding [of anatomy] that is difficult to glean from a textbook alone," explains Professor John. "Yet dissection has become less common for financial and ethical reasons, and some medical schools have taken the decision not to use cadavers in their undergraduate curricula."

A potential substitute for dissection is to use cybermedicine tools. "At Bangor, we are developing a mixed reality anatomy teaching tool. A novel interactive environment allows a student to use a plastic model of an organ to manipulate the position and orientation of a volume rendering of the anatomy (instead of using a keyboard and mouse). The volume rendering of the whole data set or just the organ of interest is overlaid onto the video stream being displayed by the computer.

"The volume rendering can also be clipped relative to an arbitrary plane to reveal data from its interior, using a second prop (such as a plastic rule) as the clipping device. We are extending these ideas to include an anatomy segmentation tool based on 3D colouring software."
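The clipping step Professor John describes rests on a standard technique: classifying each voxel by its signed distance from an arbitrary plane and discarding those on one side. The sketch below illustrates the idea in Python with NumPy; the data set, plane position and normal are hypothetical stand-ins (the Bangor tool derives the plane from a tracked physical prop, and clips during rendering rather than on the raw array).

```python
import numpy as np

def clip_volume(volume, plane_point, plane_normal):
    """Zero out voxels on the positive side of an arbitrary plane.

    volume       -- 3D array of scalar voxel intensities
    plane_point  -- a point on the clipping plane (voxel coordinates)
    plane_normal -- plane normal; voxels with positive signed
                    distance are removed, revealing the interior
    """
    # Coordinate of every voxel, ordered (axis2, axis1, axis0)
    zz, yy, xx = np.indices(volume.shape)
    coords = np.stack([xx, yy, zz], axis=-1).astype(float)

    # Signed distance of each voxel centre from the plane
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    dist = (coords - np.asarray(plane_point, dtype=float)) @ n

    clipped = volume.copy()
    clipped[dist > 0] = 0  # discard voxels in front of the plane
    return clipped

# Example: clip away the half of an 8x8x8 volume lying beyond x = 3.5
vol = np.ones((8, 8, 8))
half = clip_volume(vol, plane_point=(3.5, 0, 0), plane_normal=(1, 0, 0))
```

Because the plane is specified by a point and a normal rather than an axis, the same code handles any prop orientation; updating the two parameters each frame from the tracked ruler's pose is all that is needed to move the cut interactively.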
