Tesla’s self-driving software ‘lacks safeguards’, safety experts claim
Tesla’s self-driving software “lacks safeguards” according to safety experts working for Consumer Reports.
Elon Musk’s electric car firm started rolling out over-the-air updates earlier this month to enable “full self-driving” for eligible owners, albeit in a beta form.
Owners are still obliged to be able to take control of the vehicle at a moment’s notice, and Musk admitted on Twitter that there could be “unknown issues” that may need to be addressed in the future.
Consumer Reports (CR) said it plans to independently test the software update, known as FSD beta 9, once its in-house vehicle has received it, but its experts have already expressed concern over footage of the car’s behaviour that has emerged online, which shows it scraping against bushes, missing turnings and even heading towards parked cars.
Tesla’s vehicles already had some level of autonomy when travelling on highways, but FSD beta 9 extends automation to more driving tasks, including navigating intersections and city streets.
“Videos of FSD Beta 9 in action don’t show a system that makes driving safer or even less stressful,” said Jake Fisher, senior director of CR’s Auto Test Center. “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.”
In April, CR said its engineers had managed to defeat safeguards implemented by Tesla to prevent the driver from leaving their seat while the vehicle is in motion.
MIT professor Bryan Reimer, who researches vehicle automation, told CR that “while drivers may have some awareness of the increased risk that they are assuming, other road users – drivers, pedestrians, cyclists, etc – are unaware that they are in the presence of a test vehicle and have not consented to take on this risk”.
Rather than using expensive lidar hardware, the approach favoured by other driverless tech firms such as Google’s sister firm Waymo, Tesla’s Autopilot uses cameras, ultrasonic sensors and radar to see and sense the environment around the car.
Autopilot is considered to be a Level 2 ‘partially automated’ system by the Society of Automotive Engineers’ standards.
In April, two men being transported in a Tesla, who were believed to have been relying on its driverless abilities, died in a crash in Texas, US.
Yesterday, Musk said on Twitter that he plans to open his firm’s network of Superchargers to other electric vehicles later in the year.