
Projected images of pedestrians trick driverless cars into applying emergency brakes

Image credit: Getty Images

Driverless cars can be tricked into emergency braking by projecting 'phantom' images onto the road, causing the autopilot to believe a pedestrian is standing in the vehicle's path, researchers have found.

The team from Ben-Gurion University of the Negev’s (BGU) Cyber Security Research Centre found that the autopilot systems register depthless projections of objects (phantoms) as real objects.

They showed how attackers without any special expertise can exploit this perceptual flaw, using only a commercial drone and an inexpensive image projector, to manipulate the vehicle and potentially endanger the driver or passengers.

While fully and semi-autonomous cars are already being deployed around the world, vehicular communication systems that connect the car with other cars, pedestrians and surrounding infrastructure are lagging behind.

A 'phantom' pedestrian projected onto the road

Image credit: Ben-Gurion University of the Negev

According to the researchers, the lack of such systems creates a “validation gap”, which prevents autonomous vehicles from validating their virtual perception with a third party, forcing them to rely solely on internal sensors.

In addition to causing the autopilot to apply brakes, the researchers demonstrated they can fool the autopilot system into believing phantom traffic signs are real when projected for 125 milliseconds in advertisements on digital billboards.

Lastly, they showed how fake lane markers projected onto a road by a projector-equipped drone can steer the autopilot into the opposite lane and potentially into oncoming traffic.

“This type of attack is currently not being taken into consideration by the automobile industry. These are not bugs or poor coding errors, but fundamental flaws in object detectors that are not trained to distinguish between real and fake objects and use feature matching to detect visual objects,” said Ben Nassi, lead author on the research.

Depthless objects projected on a road are considered real, even though the depth sensors can differentiate between 2D and 3D. The BGU researchers believe that this is the result of a “better safe than sorry” policy that causes the car to consider a visual 2D object real.

The researchers are developing a neural network model that analyses a detected object’s context, surface and reflected light, which is capable of detecting phantoms with high accuracy.
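The article does not publish the researchers' model, but the idea of scoring a detected object on several independent aspects can be illustrated with a toy sketch. Everything below — the feature functions, weights and threshold — is invented for demonstration and is not the BGU team's code:

```python
import numpy as np

# Illustrative sketch only: score a detected object's image patch on three
# aspects (surface texture, reflected light, context) and combine the scores.
# All feature definitions, weights and thresholds here are assumptions.

def surface_score(patch: np.ndarray) -> float:
    """Real 3D objects tend to show rich local texture; flat projections are
    often smoother. Score = local pixel variance, clipped to 0..1."""
    return float(min(1.0, patch.var() / 1000.0))

def light_score(patch: np.ndarray) -> float:
    """Projected images are often implausibly bright at night. Penalise
    patches whose mean brightness exceeds an assumed plausible range."""
    return float(1.0 - min(1.0, max(0.0, (patch.mean() - 180.0) / 75.0)))

def context_score(on_road: bool, touches_ground: bool) -> float:
    """A pedestrian floating above the ground, or detached from any plausible
    approach path, is contextually suspicious (crude binary placeholder)."""
    return 1.0 if (on_road and touches_ground) else 0.3

def is_real(patch, on_road, touches_ground, threshold=0.5):
    """Weighted vote over the three aspect scores; below threshold => phantom."""
    score = (0.4 * surface_score(patch)
             + 0.3 * light_score(patch)
             + 0.3 * context_score(on_road, touches_ground))
    return score >= threshold, score

# Example: a bright, near-uniform patch (projection-like) vs a textured one.
phantom = np.full((32, 32), 230.0) + np.random.default_rng(0).normal(0, 2, (32, 32))
textured = np.random.default_rng(1).uniform(40, 160, (32, 32))
```

In the researchers' version each aspect would be judged by a trained neural network rather than a hand-written heuristic, but the combination step (several weak, independent judges outvoting a single fooled detector) is the design point the article describes.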

In 2019, engineers demonstrated a system that helps driverless cars minimise injuries and damage in the event of an unavoidable crash.
