Drones trained to perform quick manoeuvres using virtual environments
MIT engineers have developed a virtual-reality training system for drones that enables a vehicle to “see” a rich, virtual environment while flying in an empty physical space.
Training drones to fly fast around even the simplest obstacles is a crash-prone exercise that can have engineers repairing or replacing vehicles with frustrating regularity.
The new system, which the team has dubbed ‘Flight Goggles’, could significantly reduce the number of crashes that drones experience in actual training sessions. It can also serve as a virtual testbed for any number of environments and conditions in which researchers might want to train fast-flying drones.
“We think this is a game-changer in the development of drone technology, for drones that go fast,” said Sertac Karaman, associate professor of aeronautics and astronautics at MIT.
“If anything, the system can make autonomous vehicles more responsive, faster and more efficient.”
Karaman was initially motivated by a new, extreme robo-sport: competitive drone racing, in which remote-controlled drones, driven by human players, attempt to out-fly each other through an intricate maze of windows, doors and other obstacles.
“In the next two or three years, we want to enter a drone-racing competition with an autonomous drone and beat the best human player,” he said. To do so, the team would have to develop an entirely new training regimen.
Currently, training autonomous drones is a physical task. Researchers fly drones in large, enclosed testing grounds, in which they often hang large nets to catch any careening vehicles.
They also set up props, such as windows and doors, through which a drone can learn to fly. When vehicles crash, they must be repaired or replaced, which delays development and adds to a project’s cost.
Karaman says testing drones in this way can work for vehicles that are not meant to fly fast, such as drones that are programmed to slowly map their surroundings. For fast-flying vehicles that need to process visual information quickly as they fly through an environment, however, a new training system is necessary.
“The moment you want to do high-throughput computing and go fast, even the slightest changes you make to its environment will cause the drone to crash,” Karaman said.
“You can’t learn in that environment. If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment.”
The team’s new virtual training system comprises a motion-capture system, an image-rendering program and electronics that enable the team to quickly process images and transmit them to the drone.
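The data flow described above can be sketched in a few lines of Python. This is purely illustrative, not the team's actual software: every name here (`capture_pose`, `render_virtual_frame`, the straight-line flight path) is an assumption made for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch of the training loop described above: the
# motion-capture system supplies the drone's tracked pose, a renderer
# draws the virtual scene from that viewpoint, and the frame is
# streamed to the vehicle. All names are illustrative.

@dataclass
class Pose:
    x: float   # metres, in the test-space frame
    y: float
    z: float
    yaw: float  # radians

def capture_pose(t: float) -> Pose:
    """Stand-in for the motion-capture cameras: returns the drone's
    pose at time t (here, an assumed straight-line flight at 2.3m/s)."""
    return Pose(x=2.3 * t, y=0.0, z=1.5, yaw=0.0)

def render_virtual_frame(pose: Pose) -> dict:
    """Stand-in for the photorealistic renderer: produces the image
    the drone should 'see' from this viewpoint in the virtual scene."""
    return {"viewpoint": (pose.x, pose.y, pose.z), "scene": "living room"}

def training_step(t: float) -> dict:
    """One cycle of the loop: track, render, transmit."""
    pose = capture_pose(t)
    frame = render_virtual_frame(pose)
    # In the real system this frame is beamed to the drone's onboard
    # computer, which runs its perception and control against it.
    return frame

frame = training_step(t=1.0)
print(frame["viewpoint"])  # the drone's tracked position after one second
```

The point of the structure is that the drone's cameras are never involved: its position comes from the external tracking system, and its visual input is synthesised entirely from that position.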
The actual test space - a hangar-like gymnasium in MIT’s new drone-testing facility in Building 31 - is lined with motion-capture cameras that track the orientation of the drone as it’s flying.
With the image-rendering system, Karaman and his colleagues can draw up photorealistic scenes, such as a loft apartment or a living room, and beam these virtual images to the drone as it flies through the empty facility.
“The drone will be flying in an empty room, but will be ‘hallucinating’ a completely different environment and will learn in that environment,” Karaman explained.
The virtual images can be processed by the drone at a rate of about 90 frames per second - around three times as fast as the human eye can see and process images.
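The 90 frames-per-second figure implies a tight time budget for each rendered image, which is worth making explicit (a back-of-the-envelope calculation, not a figure from the researchers):

```python
# Per-frame time budget implied by the 90 fps rate quoted above:
# each virtual image must be rendered and transmitted within this window.
FPS = 90
budget_ms = 1000 / FPS
print(f"{budget_ms:.1f} ms per frame")  # ~11.1 ms to render and deliver each image
```

Every millisecond of tracking, rendering and transmission latency eats into that window, which is why the team built custom electronics to process and transmit images quickly.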
To enable this, the team custom-built circuit boards that integrate a powerful embedded supercomputer, along with an inertial measurement unit and a camera. They fit all this hardware into a small, 3D-printed nylon and carbon-fibre-reinforced drone frame.
The researchers carried out a set of experiments, including one in which the drone learned to fly through a virtual window about twice its size. The window was set within a virtual living room. As the drone flew in the actual, empty testing facility, the researchers beamed images of the living room scene, from the drone’s perspective, back to the vehicle.
As the drone flew through this virtual room, the researchers tuned a navigation algorithm, enabling the drone to learn ‘on the fly’.
Over 10 flights, the drone - flying at around 2.3m per second (five miles per hour) - successfully flew through the virtual window 361 times, only ‘crashing’ into the window three times, according to positioning information provided by the facility’s motion-capture cameras.
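Because the window exists only virtually, a 'crash' is a geometric judgement made from the motion-capture data: did the drone's tracked position cross the window plane inside the opening or into the frame? A minimal sketch of that check, with made-up window dimensions and coordinates (not the researchers' code):

```python
# Illustrative pass/'crash' classification from tracked positions.
# The window opening is assumed to be a rectangle centred at (0, 0)
# in the window plane; dimensions here are invented for the example.

def classify_crossing(y: float, z: float,
                      y_half: float = 0.5, z_half: float = 0.5) -> str:
    """Given the (y, z) point, in metres, where the drone's tracked
    position crosses the window plane, return 'pass' if it is inside
    the opening and 'crash' if it hit the virtual frame."""
    inside = abs(y) <= y_half and abs(z) <= z_half
    return "pass" if inside else "crash"

# Example crossings (positions in the window plane, metres):
crossings = [(0.1, 0.0), (0.45, -0.2), (0.6, 0.0), (0.0, 0.55)]
results = [classify_crossing(y, z) for y, z in crossings]
print(results)  # ['pass', 'pass', 'crash', 'crash']
```

Logging such a classification on every attempt is enough to produce tallies like the 361 successful passes and three virtual crashes reported above, with no risk to the hardware.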
Karaman points out that even if the drone crashed thousands of times, it would add little to the cost or time of development, since the crashes happen in a virtual environment, with no physical contact with the real world.