Millbrook takes the virtual track for autonomous vehicle testing
Image credit: rFpro
With autonomous cars closer to reality than concept, Millbrook has made a digital twin of its test track so this new breed of vehicles can be safely put through their paces.
You can’t put a price on anyone’s life, so it is quite understandable that we expect car manufacturers to treat safety as a basic requirement – not a ‘nice to have’. The testing required to get to this level is already considerable, but the arrival of autonomous cars will see test demands increase dramatically.
Imagine the tests required to take a normal car to market. For its Ingenium engines alone, Jaguar went through two million kilometres of testing during the five years of development. Using the obligatory comparison, that is 50 times around the Earth. Consider that every iteration of a car, whether it is a fundamental change to the chassis or a tweak to ECU code, requires a new set of tests. That represents a lot of hard miles.
However, not all of the Ingenium's tests were carried out on the road or on a test bed, of course. Much of the work was done virtually. Realistically, simulating performance is the only way to compress development into a practical time-to-market window. Without it, cars would be obsolete before they rolled off forecourts.
Track testing will always have to be part of the mix, and the Millbrook proving ground, near Milton Keynes, is one of the places where the automotive industry goes to try out its latest designs. The facility is now a far cry from the bare hillsides that were used for performance testing of cars, buses and trucks when it opened in 1970. The intervening years have seen the testing capabilities and the range of driving environments advance considerably.
Despite having such a ‘go-to’ facility, Millbrook appreciated that the demands of modern testing, coupled with the capabilities of digital technology, presented an opportunity, not a threat. Millbrook was to have its own digital twin.
Pete Stoker, chief engineer for connected and autonomous vehicles at Millbrook, explains the premise: “What’s happening is industry driving us towards a digital future. We became involved in some digital work in the past but actually we now need to understand how this proving ground can work and how we can get this into people’s development cycles. So we can actually develop a virtual proving ground for them to use before coming here to verify.”
This is not a new mindset for Millbrook. Several years ago it worked with a government agency to scan parts of the site as a low-grade survey, but the work required expensive CAD packages and heavyweight computing power, neither of which Millbrook had, and with little customer interest the project was abandoned. It did give the team an insight, however, and five years ago it began looking at the possibilities again.
The market has moved on – it’s not just complete cars that need putting through their paces. “We know the future customers are really not limited to the digital vehicle,” says Stoker. “So we’re seeing enquiries from digital sensors, algorithms, sensor fusion and so on, that just don’t have physical vehicles. We’re seeing lots of different things coming out, new growth areas.”
Moreover, the technology to cover these separate growth areas needed to be brought together in a single environment for virtual testing. Ensuring that everything works in simulation means everything is almost certain to work when the real vehicle hits the roads at Millbrook. This involves building up a library of simulated regression tests that correlate with the subsequent physical tests.
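The idea of a regression-test library that correlates with physical results can be sketched in miniature. This is an illustrative example only: the `TestResult` fields, KPI names and 5 per cent tolerance are assumptions, not Millbrook's or rFpro's actual tooling.

```python
# Hypothetical sketch of correlating a simulated test against its
# physical counterpart; field names and tolerance are assumptions.
from dataclasses import dataclass

@dataclass
class TestResult:
    scenario: str          # e.g. "emergency stop, dry surface"
    peak_lat_accel: float  # peak lateral acceleration, m/s^2
    stop_distance: float   # stopping distance, m

def correlates(sim: TestResult, physical: TestResult, tol: float = 0.05) -> bool:
    """A simulated test 'correlates' when each KPI lands within a
    relative tolerance of the physical measurement."""
    def close(a: float, b: float) -> bool:
        return abs(a - b) <= tol * max(abs(a), abs(b), 1e-9)
    return (sim.scenario == physical.scenario
            and close(sim.peak_lat_accel, physical.peak_lat_accel)
            and close(sim.stop_distance, physical.stop_distance))

sim = TestResult("emergency stop, dry", 8.1, 36.2)
phys = TestResult("emergency stop, dry", 8.3, 35.5)
print(correlates(sim, phys))  # True: both KPIs within 5%
```

Each physical run at the proving ground would add or re-validate one entry in such a library, so future development iterations can lean on simulation with known confidence.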
Chris Hoyle is technical director of rFpro, a specialist driving-simulation company working with Millbrook on the digitalisation project. He observes that this library of tests needs to be particularly large for autonomous vehicles: “It’s related to scale, to the quantity of test miles that will be required to prove your safety, hundreds of millions of miles. But it also provides you with the tools to ensure the autonomous vehicle is not just safe but that it will achieve a consumer acceptance – that it exceeds the human driver’s performance for comfort, for head toss, for passenger comfort KPIs [key performance indicators].”
There are three stages to developing Millbrook’s simulated testing process. The first is to model the real-world test site. Second is to physically model a vehicle’s interaction with that real world through its sensors and its vehicle dynamics. The third step is to scale testing massively.
To build a model of the real world, rFpro conducted a kinematic lidar (laser-scanning) survey, combining 3D reconstruction with geo-referenced spherical photography. The result is a physically modelled world whose materials respond accurately to changing conditions and lighting. Wet roads, glare and reflection can therefore all feature in simulation scenarios.
Accurate, high-frequency, phase-based lidar is used to capture the road and kerb detail that is important for vehicle dynamics applications. Longer range time-of-flight lidar is used to capture the roadside furniture and scenery.
“Underneath the graphical model there’s a high-resolution engineering surface,” says Hoyle. “One sits on a resolution grid accurate to 1mm, and that’s going to feed your vehicle dynamics model.” In other words, it simulates the effect of the road surface and condition.
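In practice, a vehicle-dynamics model queries such a surface at each tyre contact point. The following is a minimal sketch, not rFpro's implementation: it assumes the engineering surface is a dense height grid at 1mm spacing (the resolution quoted above) and interpolates between samples.

```python
# Hypothetical sketch: a dense height grid at 1 mm spacing, queried
# by a vehicle-dynamics model at a tyre contact point.
import numpy as np

RES = 0.001  # grid spacing in metres (1 mm, per the article)

def road_height(grid: np.ndarray, x: float, y: float) -> float:
    """Bilinearly interpolate surface height (m) at position (x, y)."""
    gx, gy = x / RES, y / RES
    x0, y0 = int(gx), int(gy)
    fx, fy = gx - x0, gy - y0
    h00, h10 = grid[y0, x0], grid[y0, x0 + 1]
    h01, h11 = grid[y0 + 1, x0], grid[y0 + 1, x0 + 1]
    return (h00 * (1 - fx) * (1 - fy) + h10 * fx * (1 - fy)
            + h01 * (1 - fx) * fy + h11 * fx * fy)

# A tiny flat patch with a 2 mm ridge, sampled between grid points.
patch = np.zeros((4, 4))
patch[:, 2] = 0.002
print(round(road_height(patch, 0.0015, 0.001), 4))  # 0.001
```

A millimetre grid is what lets the simulation reproduce the small surface textures and kerb details that dominate ride and handling behaviour.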
Step two is to physically model a vehicle’s interaction with that virtual world. This will be increasingly important as rival makers of autonomous vehicles start to look at the passenger experience as a differentiator. Information from the real world, therefore, needs to be collected so the real vehicle can enter the digital model.
“You can create ‘reward’ functions for ride comfort and head toss, so customer acceptance is also in the development process, not just safety,” says Hoyle. “You can position any number of sensors around the vehicle, any position, any orientation, any field of view, any refresh rate. That’s how we physically model your vehicle’s interaction with the world.”
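The two ideas Hoyle describes, free sensor placement and comfort 'reward' functions, can be sketched as data structures. Everything below is hypothetical: the field names and the simple head-toss penalty are assumptions for illustration, not rFpro's API.

```python
# Illustrative only: a minimal virtual-sensor description of the kind a
# simulator might accept; field names are assumptions, not rFpro's API.
from dataclasses import dataclass

@dataclass
class VirtualSensor:
    kind: str          # "camera", "lidar", "radar", ...
    position: tuple    # (x, y, z) in metres, vehicle frame
    orientation: tuple # (roll, pitch, yaw) in degrees
    fov_deg: float     # horizontal field of view
    refresh_hz: float  # update rate

# A simple comfort "reward": penalise lateral head acceleration
# (head toss), so planners are scored on comfort as well as safety.
def comfort_reward(lat_accels, weight: float = 0.5) -> float:
    return -weight * sum(a * a for a in lat_accels) / len(lat_accels)

front_cam = VirtualSensor("camera", (0.0, 0.0, 1.4), (0.0, 0.0, 0.0),
                          fov_deg=90.0, refresh_hz=30.0)
print(comfort_reward([0.2, -0.1, 0.3]))  # smoother rides score closer to 0
```

Because the sensors exist only in configuration, a developer can sweep positions, fields of view and refresh rates across thousands of simulated runs before committing to hardware.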
The third stage is to prove that the vehicle will be safe for driving on public roads. Even from a data-collection perspective this is a big challenge. In the last couple of years, tests at Millbrook have seen sensor numbers rise from perhaps one per vehicle to as many as six, with each sensor generating 10GB of data rather than 1GB. Then, throw into the simulation the prospect of the virtual autonomous vehicle facing a 1,000-year event every ten seconds.
It is just a matter of scaling, according to Hoyle: “So start with one vehicle, three sensors, one CPU, one GPU. Add three more sensors to this vehicle, we’ve added a second GPU. Add a second vehicle in the test, with its own sensors, another CPU, another GPU. It just keeps scaling.” Adding further features, like background noise or virtual pedestrians, requires further CPUs for each.
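Hoyle's scaling rule can be written down as back-of-envelope arithmetic: roughly one CPU per vehicle (plus one per extra feature such as virtual pedestrians) and one GPU per three sensors. The exact ratios here are assumptions read off his example, not a published sizing guide.

```python
# Back-of-envelope resource sizing implied by Hoyle's example; the
# one-CPU-per-vehicle and one-GPU-per-three-sensors ratios are assumptions.
import math

def resources(sensors_per_vehicle, extra_features: int = 0):
    """sensors_per_vehicle: list of sensor counts, one entry per vehicle.
    Returns an estimated (cpus, gpus) tuple."""
    cpus = len(sensors_per_vehicle) + extra_features
    gpus = sum(math.ceil(s / 3) for s in sensors_per_vehicle)
    return cpus, gpus

print(resources([3]))     # (1, 1): one vehicle, three sensors
print(resources([6]))     # (1, 2): three more sensors adds a GPU
print(resources([6, 3]))  # (2, 3): a second vehicle brings its own CPU/GPU
```

The point of the linear rule is that capacity grows with the test, which is exactly what makes cloud hosting the natural fit.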
Such is the capability of modern computing that the hardware is no longer the limiting factor it was only a few years ago – resources in the cloud (the Millbrook project is hosted by AWS) are effectively limitless.
As a final twist, the simulation allows the input of human test drivers. “I know of no better way to identify failure modes in autonomous vehicles than to throw human drivers at simulated tests,” claims Hoyle. “We’re random. We make mistakes. Our moods change. There’s up to 50 human drivers in an experiment at the moment and we’ll have reached 250 [in the coming months]. The best thing about having human drivers to test in simulation is no-one dies!”
Thus, Millbrook’s digital twin has been born – a simulation environment where autonomous vehicles can experience thousands of hours of accelerated virtual testing while being subjected to varying driving conditions and interacting with other vehicles and pedestrians.
In many applications of digital twins, the simulation supports the real-life product. At Millbrook the model is used in a different way, as here the simulation and the real-life product are interdependent components in the same development process.