Tesla Crash Investigators Slam Autopilot Deficiencies, Lack Of U.S. Rules For ‘Partially Automated’ Cars


Top U.S. safety investigators reviewing numerous Tesla crashes in which drivers were using the company’s Autopilot feature are highly critical of the system’s technical limitations and of the Transportation Department’s failure to set rules for so-called partially automated driving systems.

The National Transportation Safety Board has been analyzing four such accidents, including a March 2018 crash in Mountain View, California, in which Tesla Model X owner Walter Huang died when his vehicle drove straight into a traffic barrier it failed to detect. Huang was relying on Autopilot to drive for him and was apparently playing a game on his phone at the time of the accident. The NTSB determined that Autopilot lacks the ability to monitor whether drivers are paying attention and that Tesla doesn’t limit its use to specific conditions, such as highway driving. Investigators also blasted the DOT’s National Highway Traffic Safety Administration for ignoring the board’s requests to set rules for Autopilot and similar systems.

Huang’s “tragic crash clearly demonstrates the limitations of advanced driver assistance systems that are available to consumers today. There is not a vehicle currently available to U.S. consumers that is self-driving. Period,” NTSB Chairman Robert Sumwalt said at the conclusion of Tuesday’s hearing in Washington. “We urge Tesla to work on improving its Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary. It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars. Because they don’t have driverless cars.” 

While consumers await the arrival of true self-driving vehicles that don’t require a human at the wheel, Elon Musk has touted Tesla Autopilot as the most sophisticated partially automated driving system currently available, even boasting last April that it would achieve “full self-driving” capability by the end of 2019. That didn’t happen. Autopilot has, however, been tied to numerous accidents, starting with a fatal 2016 crash in Florida in which a distracted Model S owner died when his car slammed into a truck crossing his path. NHTSA investigated that accident but didn’t find Tesla at fault, in part because of the lack of guidelines for Autopilot-like systems.

NHTSA, the main regulator for U.S. auto safety, hasn’t proactively taken steps to ensure that automated driving technology is safe and used properly, said Robert Molloy, director of the NTSB’s Office of Highway Safety. “Fixing problems after people die is not a really good highway approach.”

Tesla didn’t immediately provide a statement responding to the NTSB hearing. Huang’s family filed a wrongful death suit against Tesla last May, claiming Autopilot is defective technology. 

Additionally, the NTSB wants smartphone makers to do more to prevent drivers from using the devices while they’re at the wheel, such as temporarily disabling some features to limit distraction. Cell phone use by drivers has become a growing cause of accidents and the main reason road deaths have increased in recent years.

Consumer Reports joined the NTSB in seeking tougher rules for partially automated driving technology and steps to reduce driver distraction. 

“The evidence is clear, and continuing to pile up, that if a car makes it easier for people to take their attention off the road, they’re going to do so—with potentially deadly consequences,” said Ethan Douglas, senior policy analyst for cars and product safety at Consumer Reports. “This shouldn’t be considered optional. Manufacturers and NHTSA must make sure that these driver-assist systems come with critical safety features that actually verify drivers are monitoring the road and ready to take action at all times. Otherwise, the safety risks of these systems could end up outweighing their benefits.”


