Remote surgery, car design and 3D model face scanning demonstrated at VR World
E&T Magazine attended the VR World conference in London this week, which featured an array of companies demonstrating concepts that make unique use of virtual reality (VR) technology.
By 2021, analysts predict that around 70 per cent of the VR and augmented reality (AR) market will be driven by enterprise use for commercial or business-related purposes, rather than by gaming, the sector most commonly associated with the technology today.
UK company Generic Robotics demonstrated its haptic feedback system that allows users to pick up and ‘feel’ virtual objects.
Users slide their thumb and index finger into a robotic arm-like device and see them represented on screen with two small blue spheres. The demo presented a table full of objects that could be gripped, picked up and flung around with impressively life-like physics.
The robotic arm became stiffer when trying to pick up larger objects, preventing the user from pinching as hard and lending virtual objects a convincingly solid feel.
The system was also adapted as a dental training tool, tasking the user with administering an injection into the gums of an open-mouthed virtual patient.
The arm resisted movement when the needle approached flesh, stopping it from clipping through the patient's face. Pushing down on the fake syringe provided a realistic, if slightly unnerving, experience.
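The stiffness and no-clip behaviour described above are commonly achieved in haptic devices with a penalty-force model, where the device pushes back in proportion to how far the user has pressed into a virtual surface. The sketch below illustrates that general idea; the function and parameters are illustrative assumptions, not Generic Robotics' actual implementation.

```python
# Minimal sketch of penalty-based haptic feedback, a standard technique
# in haptics. This is NOT Generic Robotics' code; names and values are
# assumptions for illustration only.

def haptic_force(finger_pos: float, surface_height: float,
                 stiffness: float = 800.0) -> float:
    """Return the upward resisting force (in newtons) for a fingertip
    pressing into a virtual surface.

    finger_pos     -- vertical position of the tracked fingertip (m)
    surface_height -- height of the virtual surface (m)
    stiffness      -- spring constant k (N/m); a larger or 'harder'
                      object would use a higher k, which the user
                      feels as increased stiffness in the arm.
    """
    penetration = surface_height - finger_pos
    if penetration <= 0:
        return 0.0  # no contact: the device applies no force
    # Hooke's-law penalty: force grows with penetration depth, so the
    # arm resists harder the further the finger (or needle) pushes in.
    return stiffness * penetration
```

In practice the device's control loop evaluates a function like this at a high rate (haptic rendering typically runs at around 1 kHz) so the resistance feels continuous rather than stepped.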
Next up was Zerolight, which is working with major players in the auto industry, such as Toyota and Volkswagen.
The firm is trialling technology that will allow potential customers to customise a vehicle in real time as well as view an accurate model of the car in detail.
Using an HTC Vive controller wand, users could open the car doors and lift the bonnet to see underneath, peer inside the vehicle and view it from any angle. Features such as the body colour and interior finishes could also be customised.
Although not technically VR, Swiss company Astrivis demonstrated some impressive 3D rendering technology using nothing more than a smartphone.
The in-development app used the selfie camera to create a detailed 3D model of a face.
Users moved the camera so that it captured their face at every angle, a process that took around 30 seconds, before it generated a 3D model. Once the model was created, the app layered the skin and hair textures accurately over the face.
The app was fast and the model it generated appeared true to life - no small feat considering it was only using data from a relatively low-resolution front-facing camera. In theory, the models could eventually be used in computer simulations or games to create a lifelike representation of a player's face.
Lastly, Vicon offered a fresh take on a 3D motion capture system for use in AR and VR that relied solely on infrared to track movements.
The representative claimed the system was far cheaper than traditional motion capture technology that is typically used in films and the gaming industry today.
The tracking was very accurate, with only a small degree of latency between the demonstrator's actions and the movements subsequently mapped onto the 3D model.