
Uber’s 2018 Pedestrian Fatality Proves Once Again That Humans Are Dangerous


The United States government has spoken: self-driving software did not kill the pedestrian hit by a developmental Uber self-driving car (SDC) in Tempe, Arizona last year. Primary blame is placed on the "safety driver," who was not doing her job of monitoring and responding to the driving environment. Those of us who closely studied the initial details of the crash came to the same conclusion. It was pretty obvious, as the safety driver was reportedly watching TV on her phone and did not respond properly as the crash situation developed.

The purpose of a safety driver is to take full responsibility for safety. Developmental self-driving software under test is, by definition, not to be trusted; developers can only assess and improve their software by seeing when and how it fails. While testing SDCs on public roads has attracted some controversy, I assert it is a valid path toward a much safer future for all of us.

The "voice of the government" in this case is the National Transportation Safety Board (NTSB), an independent federal agency responsible for investigating civil transportation accidents. Its investigation began immediately after the crash and involved intensive interactions with Uber, which the NTSB praised for its full cooperation. It was an extensive and wide-ranging endeavor, as thoroughly summarized by my Forbes colleague Brad Templeton.

To date, this tragic Uber event is the only case in which a developmental SDC has been the striking vehicle in a crash causing human harm. Looking more broadly at the automated driving space, other fatalities have occurred in production Tesla vehicles with Autopilot engaged.

It might be tempting to say that self-driving software is immature and dangerous; indeed, many pundits have said exactly that. But let's take a deeper look. The victims in the Tesla fatalities so far have been the Tesla drivers themselves, who unfortunately chose to ignore instructions from Tesla and from the vehicle itself to pay attention to the road and keep their hands on the wheel. These drivers had full responsibility for operating safely in traffic and responding to any crash-imminent situation; they chose not to fulfill it. The Uber safety driver in the Tempe crash had the same full safety responsibility and likewise chose not to fulfill it.

The fact is that no human harm has been caused by self-driving software thus far. Let’s hope this continues to be the case.

The major companies developing these systems aim to deploy vehicles that operate at a safety level far greater than that of human drivers. The more mature automated driving startups appear to have a strong safety culture; it was already in place at most of them, and the others were likely "scared straight" by the Uber crash. Uber's safety culture at the time of the crash was strongly criticized in the NTSB report; the company has since made a huge turnaround and is now a leading voice in safety practices for SDC development.

Still, much can go wrong. New startups continue to pop up, and every new player is an unknown as to how responsibly it will conduct on-road testing. While this alone is enough to keep me up at night, what worries me most is open-source software such as openpilot from comma.ai, which enables anyone with a little tech sense to rig up a regular vehicle with some degree of self-driving capability. This "hobbyist" community coming into the game creates a huge risk: non-robust self-driving systems coupled with non-attentive car owners showing off their toy to friends on Instagram. Like Tesla drivers thus far, maybe the hobbyists will kill only themselves. I'm more concerned about them harming others, in the same way I'm concerned about irresponsible Tesla drivers harming others.

The NTSB is an investigative agency whose conclusions are highly respected, but once an investigation is done, it has only a soapbox from which to seek change. Unlike the regulatory agencies, the NTSB can impose no consequences on those who ignore its findings. Nevertheless, its voice is definitive and respected. Based on the specifics of this crash, the NTSB report made safety recommendations to the National Highway Traffic Safety Administration, with NHTSA responding that "it welcomed NTSB's report and will carefully review it." The NTSB has sought regulatory action requiring more active safety equipment on cars and heavy trucks for years, but U.S. regulatory philosophy has long leaned toward a "hands off" approach: self-certification is the rule. Maybe new measures will be put in place, but the challenge for the regulator is to define exactly what to require and how to ensure compliance within the statutory boundaries established by Congress.

The tragic death of Elaine Herzberg in Tempe was caused by the same thing that killed over 6,000 other pedestrians last year: a human. We've lived with this kind of reality our entire lives; it's amazing what we can get used to. The crash avoidance systems now on most new cars, coupled with the near-term introduction of self-driving vehicles, are moving us in a new and better direction.


