
Autopilot Or Not, Tesla Drivers Need To Pay Attention


Recently, video surfaced of a Tesla plowing into a stationary object. In this case, it was an overturned truck blocking two full lanes of a highway in Taiwan. As usual, the immediate speculation was that this was another failure of Tesla’s Autopilot system. At this point we don’t know if that was the case, and frankly it doesn’t matter. Autopilot or not, Tesla’s vaunted machine vision failed to detect and respond to a large stationary object.

Every Tesla built since 2016 has eight cameras, 12 short-range ultrasonic sensors and a single forward-facing radar. Among the functions these sensors are expected to enable are driver assist features like adaptive cruise control, lane keeping assist, blind-spot monitoring and automatic emergency braking (AEB).

There have now been multiple fatal crashes involving the use of so-called Autopilot functions in which the driver was not properly supervising the system and did not take control when it failed to respond correctly. There have probably been many other crashes where the system disengaged itself shortly before impact, but the driver, unaware of the disengagement, assumed it was still on.

None of that matters in this particular crash. Sure, if Autopilot was on, it is another in the string of failures proving that Tesla is nowhere near ready to deploy full self-driving capabilities. However, even if nothing was engaged and the driver was at least nominally in control, the AEB should still have detected the truck lying across the road and applied the brakes.

The fact that the AEB did not seem to respond in any way in this crash points to fundamental failings in Tesla’s perception capability, and to the undue trust many drivers place in it. It’s not that the system doesn’t work at all; if that were the case, no one would use any of Tesla’s driver assist features. Conversely, if the perception worked all the time, Tesla could indeed offer full self-driving today.

What we see here is the worst possible scenario, where the system works often enough that drivers actually trust it and then fail to pay sufficient attention. Worse still, the manufacturer, and particularly its outspoken CEO Elon Musk, continually promotes its capabilities and the promise of ever more capable self-driving. The problem is that too much trust in technology that hasn’t earned it puts you in an uncanny valley where very bad things can happen.

Tesla’s owner instructions state that the driver is always responsible for control of the vehicle. But the presence of AEB on a vehicle that is claimed to be capable of so much more can lead to a false sense of security, even in the AEB itself. This crash, like so many others, is ultimately the primary fault of the driver unless some other system failure is discovered.

But this crash does again illustrate the dangers of putting faith in technology that has proven itself so consistently inconsistent. It also shows how far Tesla still has to go in developing a safe and robust automated driving system. You may have paid for full self-driving, but there is not a single vehicle on the market today, from any manufacturer including Tesla, that can deliver on that promise.


