Volvo Cars and Varjo launch world-first mixed-reality application for car

Virtual reality hits the road (but hopefully not the moose)

Image credit: Volvo Cars/Varjo

Companies famous for their virtual worlds are now helping to build or manage real ones. Unity, whose real-time 3D-visualisation engine is behind half of the world’s games downloads, is moving into engineering, architecture and, perhaps next, smart cities. E&T spoke to Brett Bibby, VP of engineering, and Tim McDonough, general manager for industrial, at their annual developer conference in Copenhagen.

Volvo is using Unity software and the Varjo augmented-reality headset to design its next vehicle models – inside as well as outside. I put the headset on and the AR car appears next to the real one in front of me – complete with reflections of its real surroundings. A couple of clicks and the conference hall disappears and I’m now in the back streets of an old town.

Inside the car, I can even see the detail of the stitching on the gear shift. Yet everything can be altered with a few clicks, so designers can user-test absolutely everything about the car in every imaginable weather condition or location, all with real-time, interactive 3D AR. Built-in eye-tracking technology follows the driver’s gaze, too, so designers can assess different dashboard layouts, for example.

E&T: How are you expanding your clients from games developers to design engineering and other new markets?

Brett Bibby: “Anybody who is a pioneer that wants real-time 3D has been using our tools for a long time, so I don’t know that Unity is moving there as much as it’s being more widely adopted.

The way I like to think of it is people were successful in spite of us, not because of us, and now we’re going to try and make it less difficult for people. We’re leaning into it.”

Tim McDonough: “We were looking at who was buying Unity from our website and it was Volvo and Volkswagen, BMW, Atkins, Dassault Aviation, Airbus and Boeing. Those aren’t games companies; what were they doing? They were doing just amazing things that we hadn’t imagined they would.

We hired a dedicated engineering team and now we have a full business unit supporting the manufacturing industry and supporting architecture, engineering and construction.”

E&T: How are these companies using Unity’s technologies?

TM: “We started being used in the design studio: ‘I want to visualise my product and I want to see it in a high-frame-rate environment.’ Particularly in architecture and product design, they want to see it one-to-one scale. They don’t want to look at a table-top model, they want to walk through it.

The second use case is training. Once I’ve designed a car, I’m getting my factory line refactored. I want to train my workers because there’s nothing more expensive than an idle factory. You can take the design data, put it in an app, use it for training, so your time to productivity is that much faster.

Third is sales and marketing. I can take CAD data, create a Unity model of a skyscraper, put all the materials in and I can build an apartment configurator. You can choose your floors and your appliances; I can configure my apartment. Then, if I want to sell, what’s the view on the 30th floor versus the 10th floor? As I go up and down, I see the view change. I can see how the light looks. I get a virtual view of what that begins to look like so I have the confidence to spend money.
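The apartment-configurator idea above can be sketched as a simple model: the buyer picks a storey and finishes, and the configurator reports the price and the view height a renderer would use. This is a minimal illustrative sketch; all option names, prices and the 3 m storey height are invented, not from Unity or any real configurator.

```python
# Hypothetical apartment configurator: pick a storey and finishes,
# get back the price and the parameters a 3D view would be driven by.
# All values below are invented for illustration.

FLOOR_OPTIONS = {"oak": 4_000, "tile": 2_500}       # flooring upcharges
APPLIANCE_OPTIONS = {"standard": 0, "premium": 6_000}
BASE_PRICE = 300_000
PRICE_PER_STOREY = 5_000

def configure(storey: int, flooring: str, appliances: str) -> dict:
    """Return the configured apartment: price plus renderer inputs."""
    price = (BASE_PRICE
             + storey * PRICE_PER_STOREY
             + FLOOR_OPTIONS[flooring]
             + APPLIANCE_OPTIONS[appliances])
    return {
        "storey": storey,
        "view_height_m": storey * 3.0,  # camera height for the skyline view
        "price": price,
    }

print(configure(30, "oak", "premium")["price"])            # 460000
print(configure(10, "tile", "standard")["view_height_m"])  # 30.0
```

Moving the camera between storey 10 and storey 30 is then just a change of `view_height_m`, which is what lets the buyer "see the view change" in real time.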

Number four is training autonomous vehicles. There’s a consortium of companies collaborating on autonomous vehicles (Baidu, Audi, BMW, Volkswagen, Toyota etc) called the Apollo network. They’re basically using Unity to train autonomous algorithms. You can’t physically drive enough miles, and 99 out of 100 miles are kind of useless because you’re parked, you’re at a stop sign and so on. That doesn’t teach an algorithm how not to hit a pedestrian. But we can simulate things at scale on the cloud.

So, if you want to train a million cars overnight and have them drive for X number of hours, come back in the morning and you’ve got all the data. We could never physically go from a million cars to zero cars in the real world. You can make all of those miles useful. Instead of running 99 minutes of boring driving and one minute of interesting, we run 30-second scenarios that are just the tough parts: a child crosses the street, it’s dark, it’s raining, it’s snowing, it’s bright.

That gets to my next point: we can train on things that you can’t train in the real world. How do you train to not hit a kid without throwing a kid – I’ve seen people throw dummies – in front of cars? Volvo’s scenario is the moose: how do I not hit a moose?
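The sampling idea McDonough describes, replacing boring miles with batches of short edge-case scenarios fanned out across simulated cars overnight, can be sketched like this. It is a toy illustration, not Unity or Apollo code; the hazard, weather and lighting lists are invented stand-ins for a real scenario catalogue.

```python
import random

# Illustrative sketch: generate only "tough" 30-second scenarios
# (child crossing, moose, darkness, rain...) and fan them out across
# many simulated cars for an overnight training batch.

HAZARDS = ["child_crossing", "moose_on_road", "stalled_truck"]
WEATHER = ["clear", "rain", "snow", "fog"]
LIGHTING = ["day", "dusk", "night", "bright_glare"]

def sample_scenario(rng: random.Random) -> dict:
    """One 30-second edge-case scenario for the driving simulator."""
    return {
        "hazard": rng.choice(HAZARDS),
        "weather": rng.choice(WEATHER),
        "lighting": rng.choice(LIGHTING),
        "duration_s": 30,
    }

def overnight_batch(n_cars: int, runs_per_car: int, seed: int = 0) -> list:
    """Every simulated mile is 'interesting': no parked time, no stop signs."""
    rng = random.Random(seed)
    return [sample_scenario(rng) for _ in range(n_cars * runs_per_car)]

batch = overnight_batch(n_cars=1000, runs_per_car=8)
print(len(batch))  # 8000
```

The seed makes a batch reproducible, which matters when you want to re-run exactly the scenario that an algorithm failed, something impossible with real-world miles.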

What is the in-car experience going to be when cars drive themselves? What are all the screens going to do? Is the windshield going to be a screen or an augmented-reality display? Auto-makers today are using us to design those entertainment experiences and construct a prototype with level 5 autonomy.”

E&T: How is modelling the real world, or the prototype real world, different from building virtual worlds?

BB: “Problems architects have been trying to solve are similar to the problems that we’ve been solving, but we just didn’t realise that, and I think from that perspective, it’s not a big shift. The problems of the real world tend to be much more sizeable than those of games. Games were all smoke and mirrors, though not as much as movies, because in a movie the camera can only be in one place. So, if you’ve ever been on a movie set, you just have to look a few feet off and you can see immediately that it falls apart. In games, somebody could look under the table or whatever, so it’s a little bit less smoke and mirrors.

When you get to buildings, this is the real deal. If I remove that ceiling panel, I’m sure there’s something up there. That amount of complexity [and] detail is something that games are also wanting to put in, so it’s this really beautiful interaction, where games has this technology to bring worlds to life, honed by taking advantage of hardware as best we can.

Meanwhile, the architectural engineering industry has just been building incredible structures and advancing the art of architecture and engineering and now they’re coming back together in incredible ways. I don’t think it’s a big shift in terms of the thinking; it’s a natural evolution of where we’re going.

E&T: In games, as in animated cartoons, you can repeat elements of reality to save data space, but how about in the real world?

BB: “Our biggest challenge in engineering is just the scale of the data. It’s massive. I was in Hong Kong a couple of years ago, working with an architect who was doing the Hong Kong subway system, and they had a train station and they had this tile. It’s one thing, but they stamp it out 200,000 times. There are 200,000 tiles in this thing, so you just end up with these massive data sets. In games, we have this thing called instancing: we just say that is object A, and I want A, A, A, A, A... I just create 35 As and there you go, that’s what makes it real time.
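The instancing idea Bibby describes can be sketched in a few lines: the scene stores the tile geometry exactly once and keeps only a list of lightweight per-instance placements. The class and field names below are illustrative, not Unity's API.

```python
from dataclasses import dataclass, field

# Sketch of instancing: one shared mesh, many cheap placements,
# instead of 200,000 full copies of the geometry.

@dataclass
class Mesh:
    name: str
    vertices: list  # shared geometry, stored exactly once

@dataclass
class InstancedScene:
    meshes: dict = field(default_factory=dict)     # name -> Mesh (deduplicated)
    instances: list = field(default_factory=list)  # (mesh name, transform) pairs

    def add_instance(self, mesh: Mesh, transform: tuple) -> None:
        self.meshes.setdefault(mesh.name, mesh)    # geometry kept only once
        self.instances.append((mesh.name, transform))

tile = Mesh("station_tile", vertices=[(0, 0), (1, 0), (1, 1), (0, 1)])
scene = InstancedScene()
for i in range(200_000):
    scene.add_instance(tile, transform=(i % 500, i // 500))  # grid placement

print(len(scene.meshes), len(scene.instances))  # 1 200000
```

The memory cost grows with the number of placements, not with the geometry, which is why 200,000 station tiles can still render in real time.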

I think there’s still a lot of innovation to be done where we can take large scenes and reduce that complexity down, so that we can explode it back in real time. It’s almost like a compression. Then go back to the manufacturer and get their actual render profiles, and I can figure out what kind of glaze they use, what the colour is, what the variation is. There’s metadata associated with all of these parts. So, you’ve got the visual side, and the architecture brings how it’s actually going to fit together. If you marry the two, it’s going to be pretty special.”

E&T: We’ve been talking about modelling the real world for real people in real-time, but could you take away the VDU and give those models to machines to read and make everything in a smart city, for example, run together efficiently, quickly and safely?

TM: “Two years ago I had no idea that the architectural engineering business would converge with our manufacturing business. But smart-city simulation and autonomous-vehicle simulation use all the same base data. What happens in a 5G autonomous car when it is connected to a smart city? Let’s say you’re in an autonomous car: it can only see as far as its sensors can go, but the city knows there’s a fire truck coming. If the city can tell the car something the car can’t see, the car now has longer vision and can make smarter decisions about avoiding the problem or rerouting. An AR experience sitting on the dash can show how the data has been fused together, because consumers may need to understand what the car is doing and why, so they can trust it.”
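The "longer vision" idea can be sketched as a simple merge: the car combines its own sensor detections with hazards broadcast by the city, keeping city reports about things beyond its sensor range. This is an invented illustration; the sensor range, data shapes and hazard names are assumptions, not any real V2X protocol.

```python
# Illustrative sketch of car/city data fusion: the city extends the
# car's perception beyond its own sensor range. All names are invented.

SENSOR_RANGE_M = 150.0  # assumed range of the car's own sensors

def fuse_hazards(car_detections: list, city_broadcast: list) -> list:
    """Merge hazards the car sees with city reports it cannot see yet,
    sorted nearest-first so the planner reacts to the closest one."""
    merged = {(h["kind"], h["distance_m"]) for h in car_detections}
    for h in city_broadcast:
        if h["distance_m"] > SENSOR_RANGE_M:  # beyond the car's own vision
            merged.add((h["kind"], h["distance_m"]))
    return sorted(
        ({"kind": kind, "distance_m": dist} for kind, dist in merged),
        key=lambda h: h["distance_m"],
    )

car_sees = [{"kind": "cyclist", "distance_m": 40.0}]
city_says = [{"kind": "fire_truck", "distance_m": 600.0}]

for hazard in fuse_hazards(car_sees, city_says):
    print(hazard["kind"], hazard["distance_m"])
# cyclist 40.0
# fire_truck 600.0
```

The fused, sorted list is also exactly what a dashboard AR view could display, making visible why the car is rerouting around something its own sensors cannot yet see.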

E&T: When you started out coding games 30 years ago, would you have believed you’d be doing it for the real world in the future?

BB: “You know, I absolutely believed it and maybe we all did, because you’re trying to achieve reality and then bend it to your will. Go and see ‘Avatar’ and then you imagine these crazy plants and creatures that fly or whatever, but what they’re trying to do is manifest a reality so real, that when you sit in the theatre, you are in that place.”
