
Waymo Van Operates Vacant, But Makes Surprising Potential Mistake


A recent video was published by a driver who encountered one of Waymo’s minivans in Arizona operating entirely vacant. Waymo has been doing completely unmanned operations on a small scale for some time, but it has been rare for people to encounter these in the wild. That Waymo approved even limited operations of this type was a monumental feat, because it means that Waymo’s own teams of engineers and lawyers agreed they had reached the point where this was safe to do – safe for the public, and safe for the company’s reputation. Safe enough to do in the Phoenix area, where Uber killed a pedestrian and public tolerance for any incident would be low. Safe enough to bet their multi-billion-dollar project – valued by firms like Morgan Stanley at over $100B – on this decision. For a solid team to make that bet says a lot; it almost surely needed to be approved all the way up to the Alphabet board.

I named the decision to go unmanned one of the biggest milestones in robocar history back in March of 2018, not knowing that just a few days later, the Uber fatality would knock it down a peg.

And so far, these operations have been safe, with no reported incidents. While the vehicles are unmanned, they are well connected back to Waymo HQ, where people in an operations center can look out through the vehicle’s cameras and other sensors and help it if it encounters a problem it doesn’t understand. They don’t drive the car in real time; rather, they give it strategic advice about where to drive in a strange situation, and when to go – the car still makes the steering and braking decisions.
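To make that division of labor concrete, here is a minimal sketch in Python of how “strategic advice, not remote driving” could be structured. The class and field names are entirely hypothetical – Waymo has not published its internal interfaces:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Waypoint = Tuple[float, float]  # (x, y) position in some map frame


@dataclass
class Advice:
    """High-level hint from the remote operations center.
    It suggests *where* to go; it never commands steering or braking."""
    suggested_route: List[Waypoint] = field(default_factory=list)
    ok_to_proceed: bool = False


@dataclass
class Planner:
    """Hypothetical onboard planner: remote advice re-weights the
    route choice, but the tactical driving decision stays onboard."""
    advice: Advice = field(default_factory=Advice)
    path_is_clear: bool = False  # set from live onboard perception

    def receive_advice(self, advice: Advice) -> None:
        # Advice may be seconds stale over the network; treat it as a hint.
        self.advice = advice

    def should_go(self) -> bool:
        # Even with an operator's "go," the car moves only if its own
        # sensors say the path is clear; the remote center cannot
        # override onboard collision avoidance.
        return self.advice.ok_to_proceed and self.path_is_clear


planner = Planner()
planner.receive_advice(Advice(ok_to_proceed=True))
planner.path_is_clear = False    # onboard perception sees a hazard
print(planner.should_go())       # False: advice alone can't move the car
```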

Unfortunately, in this video there is something odd, which begins around the 1:05 mark. The person making the video is excited and is driving erratically and illegally. The Waymo van pulls up to a stop sign to make a left turn and puts on its turn signal. The video car then pulls up next to it on the left in order to shoot the empty driver’s seat – and is thus driving the wrong way in the oncoming lane. The videographer also pauses at the stop sign, and the Waymo van then proceeds, with no major delay, to make its left turn, cutting right in front of the illegal driver. That driver then follows it and returns to the legal lane.

What the Waymo van did is not illegal, and nothing happened, because this wasn’t a crazy driver, just an over-eager human making use of the whole road on a mostly empty street – something human drivers do all the time for various reasons. The Waymo van had the right-of-way, and the video car was stopped. At the same time, it is surprising that a robocar would not switch into a mode of great caution in the presence of a vehicle driving the wrong direction in a lane, particularly before turning left directly in front of that vehicle, even one that has just stopped. A human driver would actually be able to see the video camera, figure out what was going on, and know it was probably safe, but I’m fairly confident the car isn’t that smart. My most likely conclusion is that the car did not become cautious because the other car was in the other lane, everybody was stopped, and there was no conflict of right-of-way.
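To illustrate the kind of rule that could produce this behavior, here is a toy heuristic – purely my own sketch, not anything Waymo has disclosed – for deciding how warily to treat another road user:

```python
from dataclasses import dataclass


@dataclass
class TrackedVehicle:
    speed_mps: float        # current speed in meters per second
    wrong_way: bool         # traveling against its lane's direction
    has_right_of_way: bool  # per the encoded traffic rules


def caution_level(other: TrackedVehicle) -> str:
    """Toy rule for how warily to treat another road user."""
    if other.wrong_way and other.speed_mps > 0.5:
        return "high"      # a moving wrong-way vehicle: hold back
    if other.wrong_way or other.has_right_of_way:
        return "elevated"  # arguably where this encounter should land
    return "normal"


# The videographer's car: wrong-way, but fully stopped, and the
# right-of-way belonged to the Waymo van.
videographer = TrackedVehicle(speed_mps=0.0, wrong_way=True,
                              has_right_of_way=False)
print(caution_level(videographer))  # "elevated" under this toy rule
```

Note that this toy rule would still flag the stopped wrong-way car as worth elevated caution; the speculation above is that whatever logic actually ran treated “stopped, no right-of-way conflict” as normal.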

It’s also possible that Waymo’s operations center was aware they had somebody trailing the van – this happens, and probably happens more with an unmanned van – and that they gave the order to go using human knowledge. I have reached out to Waymo for information on this and will update this story with any comment.

It is, of course, not entirely fair to add to a robocar’s burden the extra problem of human drivers acting strangely because the robocar is there. That’s not something most human drivers have to worry about. On the other hand, there has been criticism of Waymo and other robocars over how they handle avoiding accidents that are not their fault. Waymo’s record has been exemplary – over 10 million miles of driving with only one at-fault accident attributed to the software. That’s far better than human drivers. (Of course, no human would drive 10 million miles in a lifetime.) They have had several accidents where they were not at fault under the law, and some have argued that robocars, in their quest for safety, should also be better than humans at preventing even those accidents, at least where it is possible to do so. Had the “crazy” videographer continued driving from the stop sign, there might have been a fender bender. (On the other hand, at these low speeds, the Waymo van might well have been able to brake hard and avoid that, depending on timing. An empty van has no fear of extremely hard braking.)
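For a rough sense of why that’s plausible: stopping distance grows with the square of speed, so at intersection speeds a hard stop comes very quickly. A back-of-envelope calculation, with every number assumed purely for illustration:

```python
def stopping_distance_m(speed_mps: float, decel_mps2: float,
                        reaction_s: float = 0.2) -> float:
    """Distance to stop: reaction-time travel plus v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)


# ~15 mph (6.7 m/s), hard braking near 0.8 g (~7.8 m/s^2), and a short
# machine reaction time -- every number here is an assumption.
print(round(stopping_distance_m(6.7, 7.8), 1), "m")  # about 4.2 m
```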

With all these factors, I don’t think this is some major incident for Waymo (even with the worst presumptions). The main area of concern is that this oddity was seen in one of the few videos of Waymo’s unmanned operations ever released by a third party. When something odd happens the first time, it naturally raises concern over how many odd things are going on, but anecdotes are not data.

It’s also worth noting that this unprotected left turn is a situation where Waymo has received a lot of criticism for being too conservative and pausing too long. One of the great challenges of such projects is finding the right, but still suitably low-risk, balance on such decisions.

A frequent refrain from developers is that one of the big challenges in robocars right now is getting better at predicting what others on the street are going to do, a skill at which humans currently surpass robots in many ways. This includes predicting what law-breaking and strangely acting humans will do, and knowing when to be cautious around them. If this is an error by the Waymo system, it will quickly enter their test suite, and a variety of versions of it will be created in their simulator, so they can test all new versions of the software against drivers of this type and confirm they do the right thing. The simulator is the only place to extensively test what vehicles will do when they encounter erratic or illegal driving, and Waymo boasts over 10 billion miles of simulated testing and growing. The right thing, by the way, may not be that different from what the van did – stop for 5 seconds and proceed with caution. One thing not currently available to the robocar is what a human driver would have done – make eye contact with the illegal driver and come to some sort of non-verbal agreement about who goes first. Robocar developers know this arrow is not in their quiver, though many alternatives have been explored, mostly involving car “body language.”
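As an illustration of what “a variety of versions in the simulator” might mean, here is a hedged sketch of fuzzing one logged encounter into many regression variants. The scenario fields are invented for illustration; the real tooling is proprietary:

```python
import itertools

# Axes to vary around the logged encounter: how long the wrong-way
# car pauses at the stop sign, and what it does next.
pause_seconds = [0.0, 1.0, 3.0, 5.0]
behaviors = ["stays_stopped", "creeps_forward", "accelerates"]


def make_scenario(pause: float, behavior: str) -> dict:
    """One simulated variant of the logged wrong-way encounter."""
    return {
        "other_car": {"lane": "oncoming", "pause_s": pause,
                      "behavior": behavior},
        "expected": "no_contact",  # the pass/fail criterion
    }


scenarios = [make_scenario(p, b)
             for p, b in itertools.product(pause_seconds, behaviors)]
print(f"{len(scenarios)} regression variants from one real encounter")
```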


