
Uber Robocar Safety Driver Charged With Negligent Homicide


In March of 2018, a self-driving Uber struck and killed a homeless woman walking her bicycle across an isolated road in Tempe, AZ. The collision rocked the self-driving world, but it remains the industry's only such fatality. I've extensively covered this crash before, and most questions were settled after the NTSB investigation of the crash.

One key question remained: would there be criminal liability for the safety driver, the woman responsible for supervising the self-driving system and taking control if it made a mistake? The answer is yes. Following an indictment from a grand jury, the county attorney filed charges of negligent homicide today, a class 4 felony that can bring a sentence of 4-8 years. Few further details or comments were given by the attorney, Uber, or the defendant, who pleaded not guilty.

She was charged because her job was to operate a prototype vehicle known to need interventions, yet, as police allege, she was instead streaming the show "The Voice" on her phone and rarely looking at the road. In an ordinary car it would be extremely clear how negligent that was, so it is not surprising to see her charged. The remaining question is to what extent she decided to do this because she trusted the technology too much.

We talk about that a fair bit with systems like Tesla Autopilot and GM Super Cruise, which ask drivers to monitor the road and (for Tesla) keep hands on the wheel. The term "automation complacency" has been coined to describe the fact that the better automation gets, the more it encourages the humans supervising it to slack off at their job.

That is a real question for things like Autopilot. It's less of one for Vasquez, the driver. First, this was her job. She was poorly trained, for which Uber has taken some blame, and she had no partner, which was another mistake on Uber's part: a partner can reduce the chance of complacency and also watch the road with extra eyes.

Most of all, though, Uber's car wasn't very good. Reports suggest that at the time it needed an intervention quite frequently, perhaps every 23 miles. Yes, that might be only once or twice an hour in a place like Tempe, but it's often enough that it should have been clear it was not OK to watch a video. If it had gotten to the level companies like Waymo are at, needing interventions only every tens of thousands of miles, you could certainly see the risk of complacency.

As such, the trial may not go well for Vasquez. And we can hope this will remain the only fatality in testing. In spite of this incident, the record of the other top-quality teams testing with safety drivers is exemplary. Waymo has gone tens of millions of miles with only one minor at-fault accident, a record far superior to that of a typical human driver, who drives only about half a million miles in a lifetime. At-fault accidents of any kind, let alone fatalities, are extremely rare.

This doesn't mean that other safety drivers won't make mistakes. They will, though the chance of an accident due to those mistakes gets lower and lower as a team matures, provided the complacency problem is dealt with. This, however, was no mere mistake, and the charges are reasonable. The verdict will perhaps close out this story, at least until the next incident. There will be one, but hopefully with less negligence involved.


