
NTSB Report On Tesla Autopilot Fatality Comes Down Hard On Tesla, Driver Monitoring And Distraction


On Feb. 25, the National Transportation Safety Board held its hearing on the 2018 fatal crash of a Tesla Model X, which steered itself into the crash barrier at an off-ramp in Silicon Valley while Autopilot was engaged. Earlier, I issued a report on NTSB’s pre-release docket of facts from the crash. Refer to that report for most of the technical details behind the crash; only a short summary is given here.

While many of the findings match the source documents, the hearing statement was much stronger and bolder than past recommendations, and came down on both Tesla — for not detecting the crash barrier, for not doing enough to limit use of Autopilot to highways only and for not detecting distracted driving well enough — and on NHTSA for not forcing car companies to do these things. While products like Tesla Autopilot were developed and deployed in the absence of regulation (like all other passive and active auto safety functions), the NTSB now calls for a firmer hand.

Even more dramatically, the NTSB calls for emergency braking and collision warning systems to achieve a much higher standard. In the past, these systems were good if they caught just some of your collisions and prevented a few — now, they seem to say, such systems should prevent all common types of collisions. (In fact, getting this level of performance might be easiest with LIDAR, which Tesla has sworn off using.)

The NTSB is not a regulator. They investigate accidents, determine causes, and make recommendations for how to improve transportation safety. Instead of writing rules, they offer criticism of existing rules and practices. And they did indeed come down hard on Tesla, on NHTSA (their sister agency, which does write safety regulations), on the highway authorities and even, in a surprise addition, on Apple, which makes the iPhone that the deceased driver – an Apple employee – was using and which contributed to driver distraction.

While a firm conclusion is difficult, NTSB’s report suggests it is highly likely that the driver was playing a strategy game on his phone, and there was evidence he had a habit of doing so. It is unclear if he was playing it just before the crash, but this seems likely. He was doing so, it is also suspected, because he over-trusted Autopilot and expected it to drive his car like a self-driving car. (As noted earlier, however, the driver had at least twice before had Autopilot make the same mistake at this off-ramp and steer out of the lane towards the barrier, but he had intervened and returned to his lane in time. From one viewpoint, that makes the system seem more defective, but from another, it makes it seem odd that the driver still trusted the system enough to play a game while driving right past that troublesome off-ramp.)

You can read all their findings and recommendations in their hearing report.

Distraction

NTSB began with driver distraction, since it has a mission to understand why there is so much distraction and what can be done to mitigate it. As such, they criticized Apple for not having sufficient policies against cell phone use while driving, and for not having countermeasures in place to stop drivers (particularly those using a company-provided phone, but really all of them) from using distracting apps. (Some phones do have a function to limit some activities while driving, but usually such limits can be turned off because passengers should not be subject to them.)

Tesla has been criticized in the past, and was slammed again for not doing more to detect when drivers are distracted and to alert them or deny them use of Autopilot. NTSB also thinks Tesla should limit Autopilot to a much smaller set of roads and traffic conditions. Tesla’s manuals tell drivers to use it only on highways, but the car does not stop you from using it in other places. This complaint doesn’t really apply here, in that this crash was on a major highway.

NTSB thinks NHTSA should insist on these location restrictions, as well as insist on better driver monitoring.

Monitoring

Tesla’s driver monitoring is fairly basic. The car looks for torque forces on the wheel, and if torque or certain other signals are detected, it knows hands are on the wheel. It does not, as the NTSB has repeatedly and erroneously stated, have a way to detect that hands are off the wheel. It is normal for Tesla drivers to have hands on the wheel but not be applying torque, and thus not be detected as on the wheel. Tesla has decreased the amount of time you can go without torquing the wheel before getting a warning, but the NTSB wants more improvement than that.
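To make the limitation concrete, here is a minimal sketch (in Python, with invented thresholds and function names, not Tesla’s actual parameters or code) of how torque-based hands-on-wheel monitoring with an escalating warning timer might work:

    import time

    TORQUE_THRESHOLD_NM = 0.3   # assumed minimum torque that counts as "hands detected"
    WARNING_AFTER_S = 30.0      # assumed time without torque before the first warning
    ESCALATE_AFTER_S = 60.0     # assumed time before the system escalates or disengages

    def monitor_loop(read_steering_torque, warn_driver, escalate):
        # Poll steering torque and escalate if none is seen for too long.
        # Key limitation discussed above: hands resting on the wheel without
        # applying torque look exactly like hands off the wheel.
        last_torque_time = time.monotonic()
        while True:
            torque = abs(read_steering_torque())   # Nm from the steering sensor
            now = time.monotonic()
            if torque >= TORQUE_THRESHOLD_NM:
                last_torque_time = now             # torque seen: assume hands are on
            elif now - last_torque_time > ESCALATE_AFTER_S:
                escalate()                         # e.g. slow the car, lock out Autopilot
            elif now - last_torque_time > WARNING_AFTER_S:
                warn_driver()                      # visual nag, then audible chimes
            time.sleep(0.1)

A driver who gives the wheel a small periodic nudge, or hangs a weight on it, resets the timer without ever looking at the road, which is exactly the weakness the NTSB is pointing at.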

Because Huang, the driver, was allegedly playing this game on his phone at the same time he was torquing the wheel with enough regularity to avoid the warnings, it seems likely he had trained himself to hold the wheel and his phone at the same time, and semi-unconsciously tweak the wheel to avoid those warnings. Some drivers have gone further, putting weights on the wheel to apply torque force — though they are more obviously abusing the system. Some cars go much further, and have cameras which look at the driver’s eyes to assure attention is being paid to the road. Tesla does have a camera looking into the car in its latest models, but as yet does not use it.

Better Emergency Braking

This report saw a much stronger recommendation for the improvement of collision warning and emergency braking systems. Currently, none of these systems are anywhere near perfect. This has been tolerated because, as a new technology, they were still much better than not having them at all. If the emergency braking didn’t work, you were just back to relying on the driver to brake.

As systems get better, they breed complacency, though. People who trust their collision warning and braking systems can be more tempted to look away from the road for longer — and the systems do indeed make looking away less dangerous, though they don’t make it a safe thing to do. Now the NTSB suggests that such systems “must be able to effectively detect potential hazards.” The word “must” is strong, for it suggests they are demanding perfection. Such a demand early on would have prevented the deployment of these systems in the first place, but perhaps now they can be held to a higher standard — though clearly perfection is not a standard they can be held to.

Other

They also put some blame on the highway operator for not repairing the crumple barrier fast enough. It was smashed by an earlier accident and repairs were delayed. A restored crumple barrier might have avoided the fatality.

They also want carmakers to make it easier for them to get all the data out of cars after an accident. While one can understand their desire for this, and it would indeed be a benefit to the investigators, this seems like a modestly selfish demand, and care should be taken that such rules don’t hamper innovation by forcing carmakers to make their fancy new systems comply with old standards.

This recommendation is strange in a Tesla case, since Tesla probably records more data than any other car out there, but they want the others to catch up and make it easy for them to read and interpret that data.

Who to blame?

The findings assign causes (not strictly blame) to the accident. They do state that the Autopilot caused the crash by driving out of its lane and not detecting the barrier, but they also say it was not designed to detect the barrier. As such it performed as designed (even though they wish it were designed better), so they then tell NHTSA that it should demand that designs be able to detect things like this barrier. As noted, that’s something LIDAR is very good at while vision has trouble with, which may add some fuel to the famous LIDAR vs. vision battle around Tesla.

The report finds that electronic lock-outs on phones are an effective countermeasure. I am skeptical, since you need a feature to disable the lock-out for passengers.

They now explicitly declare that Tesla’s torque monitoring system is insufficient as a form of driver monitoring. Tesla has 90 days to respond. This is not the first time this criticism has been raised. Now NTSB is recommending that relying on torque alone be forbidden.

In a nastygram to Tesla (which was kicked out of this investigation) they delivered a finding effectively saying that if Tesla doesn’t start limiting Autopilot to places it is designed to run, more people will die. (This is odd because in this case, the car was driving in a place Autopilot is designed to operate. In fact, as the second closest highway to Tesla HQ, it was probably one of the earliest places it was designed to run.) However, NTSB is also investigating a Florida fatality involving Autopilot being used on a non-freeway highway where cross traffic caused the crash, mirroring another fatality where the same thing happened two years prior.

They have bumped the call to the NHTSA regulators, saying it is now “essential” that they deal with misuse of automation, and call NHTSA’s more lax current approach “misguided.” They want NHTSA to do a study of driver misuse and of use outside the “official” roads for which the system is rated. They want the NCAP “5-star” tests to now make sure collision avoidance systems can detect barriers, cross-traffic and unusual vehicles, all sources of recent Tesla crashes. They want NHTSA to determine if Tesla’s approach creates an unreasonable risk to safety. And they want standards for driver monitoring, required for all new vehicles with Autopilot-style systems.

Overall, they are much harder on Tesla in assessing the cause than I expected, and while they do attribute the accident to the driver’s distraction, playing a game and over-reliance on Autopilot, they realize that this driver can’t benefit from their recommendations, and others are not that likely to pay attention either. As such, to my surprise, they make no further mention of the fact the driver had seen Autopilot fail before at this off-ramp, and even allegedly complained about it.

Is this recommendation correct?

When it comes to driver assist, you certainly can’t demand systems able to brake for everything. That level of performance is what robocars are trying to attain. That doesn’t mean they can’t be asked to do better, and spot more things, and stop driving into the sides of trucks or backs of emergency vehicles. The recommendations could be clearer that it’s not a call for perfection, and that it’s still reasonable to make systems which expect and sometimes need driver supervision.

It is not out of the question that driver assist autopilots are a mistake in general, that they will always be misused, and thus should not be sold. I don’t believe this is true, and Tesla will of course deny this too. They will put forward the numbers they publish quarterly on “accidents” when Autopilot is used and not used. Sadly, these numbers are presented in a way that invites suspicion, and Tesla should clarify them so that the world can see whether their conclusion, that you are safer with Autopilot than without, is true. So far, they have declined repeated requests for clarification.

If drivers are indeed safer with Autopilot than without, then it should not be heavily regulated. If they aren’t safer, then some regulations may be needed to move that needle. The confusion comes over who it’s safer for. Tesla’s numbers suggest that attentive drivers are safer with Autopilot, and that makes sense, because you now have two “brains” examining the road for problems and combining to handle them better. At the same time, obviously people who abuse Autopilot are not safer than they were before; that would only be true if the system had reached robocar levels of reliability. It is possible that if 95% of drivers are made safer and the 5% of drivers who abuse it are made less safe, overall it has increased safety and should not be banned. Real data is needed to settle that question.
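As a purely hypothetical illustration of that arithmetic, here is the calculation with invented crash rates (these are not Tesla’s published figures, just example numbers):

    baseline_rate = 2.0                            # assumed crashes per million miles without Autopilot
    attentive_share, attentive_rate = 0.95, 1.2    # assumed: 95% attentive users, somewhat safer
    abuser_share, abuser_rate = 0.05, 6.0          # assumed: 5% abusers, much less safe

    # Fleet-wide rate with Autopilot is the weighted mix of the two groups.
    fleet_rate = attentive_share * attentive_rate + abuser_share * abuser_rate
    print(f"{fleet_rate:.2f} vs {baseline_rate:.2f} crashes per million miles")
    # Prints 1.44 vs 2.00: the fleet is safer overall even though the abusers are worse off.

With different assumed numbers the conclusion could easily flip, which is why real, clearly presented data matters.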

The recommendation that phones lock drivers out of distracting functions seems a lost cause. People will lie to get around it, and it’s sure to cause a great deal of frustration. The recommendation as written asks to lock out all devices when they are in motion, which clearly would face a major revolt from every transit passenger, Uber passenger or regular car passenger. The same is true of a requirement that companies write better safety policies for their employees who are driving. They are not going to be obeyed.

While it was reasonable for Tesla to have simple driver monitoring at the start, it may be reasonable to demand they improve it. While the recommendations apply only to new cars, Tesla, being Tesla, may well be able to apply them to their old cars as well. Tesla has resisted being paternalistic to its customers. This resistance has been the norm among automakers for decades — from the days of the first cruise controls, they were not designed to refuse to let you set them over the speed limit, and they would have been rejected if they were.

I feel NTSB is too hard on NHTSA for not regulating enough. The history of automotive safety tech — including seat belts, crumple zones, airbags, anti-lock brakes, anti-skid, collision warning, emergency braking and many others — is one of innovations created by companies and deployed unregulated in the market for years to decades before regulations came on board, and in most cases those regulations were of the form, “This is really good, everybody should do it” and much later “here is the level you should attain.” It’s perfectly normal to start unregulated and learn by practice what works and what needs mandating.

Tesla has taken the approach of telling its customers what they rate the system as being able to do, but leaving it up to the customers to control how they use it. They know that some customers will break those rules, and it’s a higher-level philosophical question as to how much control the government or vendors should have over how people use their products. The one important caveat here is that a car driver abusing such systems can harm people other than themselves. The driver in this accident also smashed 3 other cars, but fortunately those occupants survived.

The gauntlet, though, has been thrown down. Another gauntlet will probably come when hearings are held on the 2019 Florida fatality involving cross-traffic. Tesla could ignore it. They could implement camera-based driver monitoring, though they have resisted that. They will try to improve their detection of obstacles and emergency braking without any push from regulators — they can’t get anywhere on their so-called “full” self-driving product without doing that anyway. Everybody else will feel the same.

Your move, Tesla.


