Transportation

Lessons Gleaned From The NTSB Report For A Tesla Autopilot-Engaged Car Crash


The NTSB has released its findings about a crash in which a Tesla Model S, with Autopilot engaged, rammed into the back of a fire truck parked on a busy freeway in Southern California on the sunny morning of January 22, 2018.

NTSB investigators examined the collected evidence about the crash, interviewed the driver and an eyewitness, and sorted through vehicle data pertaining to the moments leading up to and including the incident.

In my initial review of the NTSB report, several elements stand out as valuable lessons learned, and I’ve detailed those points below.

Background About The Incident

As useful background about the incident, the Tesla was heading southbound on a busy freeway, the I-405, and came upon a fire truck that had responded to a prior incident involving a downed motorcyclist.

The fire truck had parked in the HOV lane to shield the area ahead of it, allowing fire department personnel and other first responders to attend to the downed motorcyclist.

As is customary in such instances, the fire truck was parked slightly askew, straddling the HOV lane to block traffic. The angled position helps oncoming drivers realize that the truck is stationary. If the truck were parked straight ahead, a driver viewing only its rear could be misled into assuming it was moving forward; by parking askew, the crew hopes that arriving traffic will recognize the truck is parked and deliberately blocking the lane.

The Tesla was in the HOV lane and coming up quickly toward the parked fire truck.

According to the driver of the Tesla, a somewhat large vehicle such as an SUV was in front of the Tesla in the time leading up to the fire truck, and this other vehicle apparently blocked his view of the fire truck. At the last moment, the vehicle ahead of the Tesla darted out of the HOV lane, and the Tesla proceeded to ram into the rear of the fire truck.

The Tesla driver survived the impact and was able to get out of the Tesla, stunned yet able to walk around the incident scene.

The Tesla incurred a substantial amount of damage, including a crushed front bumper, buckled hood, broken headlights, shattered windshield, and other collision damage.

The fire truck was struck primarily at the left rear, the angled portion closest to the straight-ahead impact of the Tesla. The fire truck incurred relatively minor damage, and fortunately none of the nearby firefighters were injured by the crash.

Let’s now unpack salient aspects of the NTSB Report.

Driver Familiarity With Automation Being Used

One important aspect about the use of car automation involves whether the human driver knows how to use the automation.

Today’s ADAS (Advanced Driver Assistance Systems) are becoming increasingly complex and there is concern that drivers might not properly employ the automation.

Consider these key aspects in this particular case.

Situation:

The driver of the Tesla had owned the car for about 6 months.

It was his first Tesla; he had previously driven a Prius. He indicated that part of the reason he opted to purchase a Tesla was the Autopilot technology.

His daily commute to work consisted of taking the same path that he was undertaking on the day of the incident and he routinely used Autopilot during those trips.

Analysis:

This driver presumably knew enough about Autopilot to be familiar with it.

Had he just bought the car, or had it been a rental, he might not have known Autopilot’s capabilities and limitations. As it was, he routinely used Autopilot, and his remarks during the NTSB interview reflected a general understanding of its features.

Also, notably, he was driving in an area that he knew, rather than in an unfamiliar area.

Lessons:

The aspect that the driver was familiar with Autopilot provides both a positive and a negative in these kinds of situations.

Drivers can become lulled into thinking that the car’s automation will largely protect them, and therefore allow themselves to become complacent.

Overreliance on the automation is an easy trap to fall into. If you drive for hours and hours, day after day, and the automation seems to be safely guiding your car, you can mistake that pattern for evidence that the automation can do more than it really can.

Dovetailing with that, driving in a familiar area can create a kind of double whammy: the automation is operating in familiar territory and being used in a familiar way, all of which can cause a driver to let their guard down.

Sensors Aspects Are Crucial

The sensors built into the car and used for aiding the automation during the driving task are crucial to how the driving will be performed by the automated system.

An appropriate set of sensors, varied so as to provide multiple perspectives, is a prudent approach. The sensor data also needs to be assessed by the on-board sensor fusion capability, which makes a cohesive whole out of the piecemeal raw detections performed by the individual sensors.

Situation:

The driver of the Tesla indicated that when he bought the car, he had it fully inspected by a Tesla dealership, wanting to make sure that the used Tesla was in good shape. There were some repairs made and upon doing so he was apparently told that the Tesla Model S was ready for use.

Subsequently, according to the interview, he had the cameras on the front of the Tesla replaced, apparently twice, along with having either a radar or sonar unit also replaced.

On the day of the incident, the car was operating on Tesla Hardware Version 1 and had last received a firmware OTA (Over-The-Air) update on December 28, 2017, about 25 days prior to the crash.

He also indicated that in his experience of using Autopilot, when the car was heading directly toward the sun, Autopilot at times seemed unable to sufficiently gauge objects ahead of the car. He likened this to a person squinting to see what’s ahead when looking into the sun, and said he would at times turn off Autopilot and revert to purely manual control of the Tesla when the sunshine seemed worrisome.

He indicated that the incident that morning did involve driving toward the sun.

Analysis:

We don’t know why the cameras were apparently replaced, nor why the sonar or radar unit was replaced. Whether this was a factor in the incident is unknown.

That being said, there hasn’t been any stated indication in the NTSB Report that the Tesla itself was not functioning or that any of the sensors or automation systems were having any issues.

In terms of the Autopilot update, the fact that it was recent implies that presumably the latest available version was on board, though the NTSB Report doesn’t clarify whether a more recent version was available (and, if so, whether it bears any relationship to the incident).

Lessons:

Drivers tend, over time, to discover facets about their cars, such as knowing that a car handles left turns well but maybe gets sticky on right turns.

The same kind of driver discovery happens with the automation being used on a car, namely that the human driver gradually identifies quirks and nuances about the automation.

In this case, the driver said that he had discovered that sunlight shining directly at the front of the Tesla was able to at times create difficulties for the Autopilot.

The cameras might be unable to capture clear images, or be otherwise confounded by a flood of sunshine, and the washed-out images might not be readily interpreted by the Machine Learning system employed.

This also highlights the importance of having multiple sensors of different types, such that if one particular sensor, or one kind of sensor, is stymied, the others might make up for the difficulty.

During sensor fusion, the data from the several sensors is pulled together in real-time and analyzed by the automation, which tries to figure out the true nature of the driving scene, including whether to discount sensory data that might be unreliable and give greater weight to other sensory data.
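To make that weighting idea concrete, here is a minimal, hypothetical sketch of confidence-weighted sensor fusion. The sensor names, confidence values, and cutoff threshold are illustrative assumptions on my part, not Tesla’s actual design.

```python
# Hypothetical sketch of confidence-weighted sensor fusion (illustrative only;
# sensor names, confidence values, and the threshold are assumptions, not
# Tesla's actual implementation).

def fuse_detections(detections):
    """Combine per-sensor distance estimates into one weighted estimate.

    detections: list of (sensor_name, distance_m, confidence) tuples, where
    confidence in [0, 1] reflects how reliable that reading is judged to be
    (e.g., a sun-blinded camera would be assigned a low confidence).
    """
    # Discard readings judged too unreliable to use at all.
    usable = [(name, dist, conf) for name, dist, conf in detections if conf >= 0.2]
    if not usable:
        return None  # no trustworthy detection of an object ahead

    total_weight = sum(conf for _, _, conf in usable)
    fused_distance = sum(dist * conf for _, dist, conf in usable) / total_weight
    return fused_distance

# Example: the camera is washed out by sunlight (low confidence), while the
# radar reports a stationary object ahead with moderate confidence.
readings = [
    ("camera", 150.0, 0.1),   # sun-blinded: dropped by the threshold
    ("radar", 33.0, 0.6),
]
print(fuse_detections(readings))  # only the radar reading survives -> 33.0
```

The design question raised in this incident is precisely how such a scheme deliberates when one sensor type is degraded: whatever the real system does, it must decide in real-time which readings to trust.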

If perchance the cameras were hindered by the sunlight, what did the radar detect and how did the sensor fusion deliberate between the camera images interpretation and the radar data interpretation?

Did the angle of the parked fire truck potentially disturb the radar detection, maybe reducing the certainty of the radar return?

If there had been a LIDAR capability, a type of sensory device eschewed by Elon Musk and not included in Tesla’s sensory suite, would it have aided in potentially detecting the fire truck within the final seconds remaining prior to impact?

LIDAR units are oftentimes placed on the top of a vehicle to gain a 360-degree perspective. If so, would a vantage point from the rooftop of the car have had a heightened chance of detecting the parked fire truck sooner than the sensory devices mounted lower on the vehicle body?

Data About The Crash

You likely know that airplanes contain a so-called “black box” that collects vital data about a plane, doing so as a means to allow an after-crash analysis of what was occurring in the plane’s systems at the time of the crash.

The automotive counterpart is referred to more formally as an Event Data Recorder (EDR); on airplanes, we take such recorders for granted.

In the United States, the use of EDRs on cars is a voluntary activity by the car manufacturers, and there is no federally mandated legal requirement that an EDR be included in a normal car. Some automakers include an EDR, some do not.

There is an ongoing debate about whether or not an EDR should be a mandatory piece of equipment on cars.

Situation:

The Tesla did not have an Event Data Recorder, which as mentioned above is not a requirement.

In lieu of an EDR to examine post-crash, the NTSB asked Tesla to provide data; per the NTSB Report, “Tesla wirelessly downloaded the recorder/Autopilot data from the crash involved vehicle post-crash; this data was provided to NTSB investigators.”

The data indicated that the Tesla’s ignition cycle had started about 66 minutes before the crash, meaning the car had been turned on for roughly that long.

Furthermore, the data showed that the Tesla was following a vehicle ahead of it, prior to the crash, and the system was adjusting the speed of the Tesla to maintain a relatively constant time-based distance to the “lead” vehicle.
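A time-based following distance can be illustrated with simple arithmetic using figures that appear in the report (a speed of about 21 mph and a following distance of 33 meters); the headway computed below is my own back-of-the-envelope figure, not a number stated in the report.

```python
# Illustrating a time-based following gap using figures from the report:
# the Tesla at about 21 mph, following distance of 33 meters.

MPH_TO_MPS = 0.44704  # meters per second, per mph (standard conversion)

speed_mps = 21 * MPH_TO_MPS          # ~9.4 m/s
follow_distance_m = 33.0

# Time gap = distance to the lead vehicle divided by own speed.
time_gap_s = follow_distance_m / speed_mps
print(round(time_gap_s, 1))  # ~3.5 seconds of headway to the lead vehicle
```

Holding that time gap roughly constant is what causes the system to speed up or slow down as the lead vehicle does.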

According to the data, with approximately 4 to 7 seconds left to go before the crash occurred, the Tesla was “traveling at a consistent speed of about 21 mph, the lead vehicle was slowing.” And, that the “lead vehicle changed lanes at 3-4 seconds before the crash.”

I’ll come back to this data in a moment.

Analysis:

The data seems to match to the driver’s remarks that he was following another vehicle while approaching the parked fire truck.

Plus, the data seems to support the indication by the Tesla driver that the vehicle ahead darted out of the lane just as the parked fire truck was a few seconds away.

Lessons:

The Tesla driver said that he didn’t see the fire truck, presumably because his view was blocked by the vehicle ahead of him.

One of the potential aspects about using a fully rigged automated car that has a full suite of sensors is the possibility that it might be able to detect objects ahead that the human driver might not see. This capability obviously depends on the types of sensors used, along with where the sensors are placed onto the car.

This is not an easy matter, though, and it is possible that even the best of sensors might be blocked or obstructed.

In terms of the debate about having an EDR, some proponents argue that without one, the post-crash investigation becomes overly dependent upon the car manufacturer, relying on the manufacturer to retrieve any data and provide it, in whole or in part, to investigators.

Detection And Reaction By The Automation

A human driver who expects the car’s automation to take vital steps during a traffic exigency is assuming that the automation will appropriately detect a dire driving situation and then take the needed corrective action.

Situation:

Per the NTSB Report, once the vehicle ahead of the Tesla abruptly changed lanes, the data indicates that the “following distance of 33 meters at 4.1 seconds before the crash dramatically increased to 120 meters one second later; the 120 meter is a default value indicating that the system has not detected a vehicle in front.”

Furthermore, per the Report: “As the system no longer detected a lead vehicle 3-4 seconds before the crash, Autopilot started accelerating the Tesla toward the TACC-set cruise speed of 80 mph which the driver set nearly 5 minutes before the crash.”

And, the Report also states that: “At the time of the impact, the Tesla was traveling at the speed of 30.9 mph.”

Finally, here’s an essential and telling aspect too: “Data shows that about 490 msec before the crash, the system detected a stationary object in the path of the Tesla. At that time, the forward collision warning was activated; the system presented a visual and auditory warning.”

The NTSB Report also states that the Automatic Emergency Braking (AEB) system did not engage, nor did the human driver apply the brakes at the time of the incident.

Analysis:

It would seem that once the lead vehicle got out of the lane, there might have been a chance at that juncture to detect the parked fire truck, assuming it wasn’t somehow feasible to have detected it earlier.

Perhaps there were three to four seconds or less available, according to the reported data, though it is difficult to say. It appears that with about a half-second left, the system did detect the fire truck.

It is problematic to parse those few seconds in the absence of knowing more about how the automation is structured and designed: time is needed for the sensors to collect data, for sensor fusion to assess the data, for the computer processors to figure out what it means and what to do next, and so on.

Interestingly, the forward collision warning was apparently issued to alert the driver (with a half-second left to go), though the AEB wasn’t activated.
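Some back-of-the-envelope arithmetic using the report’s figures shows how little that half-second warning could have accomplished. Note that the 1.5-second perception-reaction time used below is a commonly cited rough value for an alerted driver, an assumption of mine, not a number from the NTSB Report.

```python
# Rough arithmetic on the final half-second, using the report's figures.
# The 1.5 s perception-reaction time is a commonly cited approximation for
# an alerted human driver; it is an assumption, not from the NTSB Report.

MPH_TO_MPS = 0.44704  # meters per second, per mph

impact_speed_mps = 30.9 * MPH_TO_MPS   # ~13.8 m/s at impact
warning_lead_s = 0.490                 # forward collision warning ~490 ms out

# Approximate distance remaining to the fire truck when the warning sounded
# (treating speed as roughly constant over the final half-second).
distance_at_warning_m = impact_speed_mps * warning_lead_s
print(round(distance_at_warning_m, 1))  # ~6.8 meters left when warned

assumed_reaction_s = 1.5
print(warning_lead_s < assumed_reaction_s)  # True: far less than reaction time
```

In other words, by the time the warning fired, the car was only a handful of meters from the truck, well inside the time a typical driver would need merely to perceive and react, let alone brake.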

Lessons:

Automation for cars can be a complex, intertwined set of somewhat disparate components, each of which has its own particular functionality, and yet in the end those components need to work as a tightly organized, cohesive whole.

The timing of how those components work can be crucial to the driving task.

The Driver

Automakers producing Level 2 and Level 3 cars are apt to indicate that the ultimate responsibility for the driving of the car rests with the human driver.

Thus, essentially no matter what the automation does or does not do, the argument made is that responsibility sits on the shoulders of the human driver, and whatever happens in a crash rests with the driver, not with the automation.

Situation:

The Tesla driver indicated that he had his left hand on his knee and that his fingers were lightly touching the steering wheel, and his right hand was free, though he also indicated that he might have had a coffee cup in his right hand.

According to the NTSB Report: “The system detected driver’s hands on the steering wheel for only 78 seconds out of 29 minutes and 4 seconds during which the Autopilot was active.”

The driver indicated that he was not using his cell phone at the time of the crash.

An eyewitness in a car near the Tesla claimed that the Tesla driver was looking persistently downward, as though looking at some device in his hand, and said that the Tesla driver did not look up at all during the final moments leading to the impact with the fire truck.

Analysis:

Was the Tesla driver paying attention to the roadway and simply got caught unawares by the sudden lane change of the car ahead?

Or, was the Tesla driver not paying attention, perhaps believing that the automation was doing just fine, and allowed himself to become distracted by something else inside the car, such as the coffee or a cell phone?

Lessons:

There is a growing concern that drivers in Level 2 and Level 3 cars might readily allow themselves to become distracted from the driving task, and as such, some automakers are adding more and more tech to try and ascertain whether a driver is remaining alert to the driving effort.

Conclusion

The name Autopilot has stirred controversy, since it could be construed as misleadingly suggesting that the automation can perform in a truly autonomous manner. Notably, the Tesla driver himself remarked during his interview that he felt the name was not accurate, since the vehicle does not fully drive itself.

In addition, some are concerned that drivers first stepping behind the steering wheel of these partially automated cars will not properly understand what the car does and what they need to do as drivers.

Oftentimes, the automaker will assert that the owner’s manual lays out the operations of the vehicle and that the driver is therefore duly informed.

The Tesla driver said that he had an owner’s manual but had never looked at it; instead, when he bought the Tesla, he had a salesman take a few minutes to show him how to operate it. Whether a salesman’s quick demonstration of the features and limitations of a Level 2 or Level 3 car is sufficient for proper operation of an ADAS-equipped vehicle is open to debate.

Overall, for each of these instances of any car crashes involving a Level 2 or Level 3 car, it is important and useful to consider lessons that can be gleaned, hopefully providing insightful guidance to the further refinement of ADAS and also for the pursuit of true self-driving cars.


