
Shadow detection system lets autonomous vehicles see around corners
A system has been developed to allow driverless cars to anticipate when vehicles or people are coming around a corner by analysing changes to shadows on the ground.
Autonomous cars are often said to be safer than human drivers due to their near-instant reaction times and decision-making. Further improving their understanding of the surrounding environment should help to avoid situations such as the fatal 2016 Tesla crash, in which the car's Autopilot feature failed to differentiate the white side of a tractor trailer from the brightly lit sky behind it.
Now, engineers from the Massachusetts Institute of Technology (MIT) have developed a new system they say is superior to lidar and can improve stopping times by more than half a second. While this may not sound like much, fractions of a second can make a significant difference in an accident scenario involving fast-moving autonomous vehicles, the researchers said.
“For applications where robots are moving around environments with other moving objects or people, our method can give the robot an early warning that somebody is coming around the corner, so the vehicle can slow down, adapt its path and prepare in advance to avoid a collision,” said researcher Daniela Rus. “The big dream is to provide ‘X-ray vision’ of sorts to vehicles moving fast on the streets.”
Currently, the system has only been tested in indoor settings. Robotic speeds are much lower indoors and lighting conditions are more consistent, making it easier for the system to sense and analyse shadows.
For their work, the researchers built on their existing system, 'ShadowCam', which uses computer-vision techniques to detect and classify changes to shadows on the ground.
The system uses sequences of video frames from a camera targeting a specific area, such as the floor in front of a corner. It detects changes in light intensity over time, from image to image, that may indicate something moving away or coming closer.
Some of those changes may be faint or entirely invisible to the naked eye, as they depend on various properties of the object and its environment.
ShadowCam computes that information and classifies each image as containing a stationary object or a dynamic, moving one. If it classifies an image as dynamic, the system reacts accordingly.
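To illustrate the basic idea, a minimal frame-differencing classifier along these lines could look like the Python sketch below. This is not the researchers' code: the real ShadowCam pipeline amplifies and classifies subtle shadow signals far more carefully, and the threshold, region of interest and video file name here are all placeholder assumptions.

```python
import cv2
import numpy as np

# Hypothetical tuning values, not ShadowCam's actual parameters.
INTENSITY_THRESHOLD = 2.0                  # mean absolute change counted as "dynamic"
ROI = (slice(300, 480), slice(100, 540))   # assumed patch of floor in front of the corner

def classify_frame_pair(prev_frame, cur_frame):
    """Label a frame 'dynamic' if the floor patch brightened or darkened enough."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    cur_gray = cv2.cvtColor(cur_frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = np.abs(cur_gray[ROI] - prev_gray[ROI])
    return "dynamic" if diff.mean() > INTENSITY_THRESHOLD else "static"

cap = cv2.VideoCapture("corner_camera.mp4")  # hypothetical input video
ok, prev = cap.read()
while ok:
    ok, cur = cap.read()
    if not ok:
        break
    if classify_frame_pair(prev, cur) == "dynamic":
        print("Early warning: possible movement around the corner")
    prev = cur
```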
Adapting ShadowCam for autonomous vehicles required a few advances. The early version, for instance, relied on lining an area with augmented reality labels called “AprilTags,” which resemble simplified QR codes. Robots scan AprilTags to detect and compute their precise 3D position and orientation relative to the tag. ShadowCam used the tags as features of the environment to zero in on specific patches of pixels that may contain shadows. However, modifying real-world environments with AprilTags is impractical.
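For context, AprilTag detection is available in open-source libraries. A hedged sketch of the tag-scanning step using the `pupil_apriltags` Python bindings might look like the following, where the image file, camera intrinsics and tag size are all assumed values:

```python
import cv2
from pupil_apriltags import Detector  # open-source AprilTag bindings

detector = Detector(families="tag36h11")

# Hypothetical input image of a tagged corner.
gray = cv2.imread("corner_scene.png", cv2.IMREAD_GRAYSCALE)

# fx, fy, cx, cy are placeholder camera intrinsics; tag_size is the
# printed tag's edge length in metres (also assumed here).
detections = detector.detect(
    gray,
    estimate_tag_pose=True,
    camera_params=(600.0, 600.0, 320.0, 240.0),
    tag_size=0.16,
)

for det in detections:
    # pose_R / pose_t give the tag's 3D orientation and position relative
    # to the camera, which the early ShadowCam version used to anchor
    # its region of interest.
    print(det.tag_id, det.pose_t.ravel())
```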
The researchers developed a novel process that combines image registration and a new visual-odometry technique. Often used in computer vision, image registration essentially overlays multiple images to reveal variations between them.
The researchers specifically employ 'Direct Sparse Odometry' (DSO), which can compute feature points in environments similar to those captured by AprilTags. Essentially, DSO plots features of an environment on a 3D point cloud and then a computer-vision pipeline selects only the features located in a region of interest, such as the floor near a corner.
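As a rough illustration of these two steps (not MIT's DSO pipeline), the sketch below uses OpenCV's ECC algorithm as a stand-in for the registration stage, and filters a stand-in 3D point cloud down to an assumed floor region of interest; the ROI bounds and the random points are purely illustrative.

```python
import cv2
import numpy as np

def register_to_reference(ref_gray, frame_gray):
    """Warp frame_gray onto ref_gray so that shadow changes, not camera
    motion, dominate the per-pixel difference (ECC image registration)."""
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-4)
    _, warp = cv2.findTransformECC(
        ref_gray, frame_gray, warp, cv2.MOTION_EUCLIDEAN, criteria, None, 1
    )
    h, w = ref_gray.shape
    return cv2.warpAffine(frame_gray, warp, (w, h),
                          flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)

# Hypothetical stand-in for visual-odometry output: N x 3 array of 3D points.
points = np.random.rand(1000, 3) * 10.0

# Keep only features near the floor in front of the corner (assumed bounds).
roi_mask = (points[:, 2] < 0.1) & (points[:, 0] > 2.0) & (points[:, 0] < 6.0)
floor_points = points[roi_mask]
print(f"{len(floor_points)} of {len(points)} features fall in the floor ROI")
```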
The researchers implemented ShadowCam in an autonomous car in a parking garage, with the car's headlights turned off to mimic night-time driving conditions. They compared its car-detection times against those of lidar. In an example scenario, ShadowCam detected a car turning around pillars about 0.72 seconds faster than lidar. Moreover, because the researchers had tuned ShadowCam specifically to the garage's lighting conditions, the system achieved a classification accuracy of around 86 per cent.
The researchers said they will develop the system further to work in different indoor and outdoor lighting conditions.