
Waymo Performs Embarrassingly In Construction Cone Situation


A recent Waymo ride in Chandler, Arizona, recorded by YouTuber JJRicks (who has made a side profession of posting videos of Waymo rides and is a fan), knocks Waymo’s otherwise excellent progress down a few pegs. The vehicle can’t handle some road lanes that have been marked off with traffic cones, but that’s not the real issue. Those sorts of errors are to be expected, though they should be rare, and they do seem to be rare for Waymo.

Another rare failure was a remote operator giving wrong instructions to help the car out of its problem. Again, that’s going to happen — ideally rarely.

The real failure was being unable to handle the failure gracefully. Robocars will have situations like this for a long time to come, and while you want to eliminate them, it’s even more important you fail gracefully. Waymo does not, and the result is a comedy of errors.

The video description contains links to the key sections, and the important one starts 11 minutes in. Workers have coned off the right lane of a road. Only at intersections do the cones make it clear which lane is closed; otherwise you just see two lanes with occasional cones on the lane line. Waymo vehicles are good at detecting cones.

In the video, the car becomes confused to the point of stalling for multi-minute periods in the middle of the street. In such cases, a remote operator is summoned to examine the situation as seen from the many cameras, and give the car a new path to get out. But for whatever reason, this doesn’t work. Waymo states that at this point their remote ops team made a mistake in how they instructed the car, but they don’t detail what the mistake was. Whatever they did, the car waits for a long time before moving and gets more confused once it moves. It stalls again, in the middle of the cones between two lanes. (In addition, the Waymo takes a strange detour before it gets confused, so possibly the wrong instructions came earlier.)


At this stage, the problem is escalated. Waymo maintains crews of two in vans around its service area. In such a situation, one of these vans is dispatched to rescue the car: a driver gets out, gets in the car and drives it manually to resolve the situation. At least that’s what’s supposed to happen. This time, it goes very, very badly.

Waymo, in their statement, admits fault but states that the vehicle did not do anything unsafe. I would not agree, and am surprised at this claim. For long periods their vehicle stops, blocking traffic either halfway into the lane or fully into it, forcing all other cars to drive around it, which means entering the construction zone or the oncoming lane. That is not a safe situation, though human drivers can usually handle it. In addition, the rescue driver has to walk in traffic to enter the car in this situation, which is not tremendously safe (though the rescue driver waits for a clear enough break in traffic to make it safe).

As an added twist, the construction has ended, and a work crew comes to remove the cones in one section while the car is stalled among them. That is an unusual situation but doesn’t affect things greatly. The car doesn’t really understand this has happened.

The most embarrassing thing is the way the arrival of the rescue driver is handled. Three times the car suddenly decides it can figure a way out and starts driving, only to get stuck again. Worse, it does this twice just as the rescue driver has arrived and is preparing to enter the car. At a certain point, when the rescue driver is about to enter, a signal comes which freezes the car and unlocks the door, but until that signal arrives the car is still free to run away, and it does so twice. Most of the situation would have been avoided had the car been frozen once the rescue driver was within a minute of entering it. But it’s funny, in a cringe sort of way.

As the crews remove the cones, they ask the passenger what’s up, and he tells them the rescue crew is not far away. But it doesn’t end there. Indeed, once the car decides it has solved the problem and starts moving — even a short distance — it cancels the rescue order, and then immediately has to restart it when it gets stuck again in the same confusing zone.

Mistake #1 — not handling the cone zone

As noted, this is the sort of mistake you should expect, and the real issue is how you deal with it. Even so, this is a surprising mistake because this sort of cone situation is hardly unusual. I would have expected Waymo to have tested this situation often on their test track and in simulation. (No fear, I am sure it is now being put in their simulation suite, and all variations of it will be tested from now on.)

In fact, I would have expected them to have tested much stranger variations of this in simulation long ago, so this is disappointing. The car gets quite confused by the cones, and does not seem able to tell which of the cone-divided lanes is the one to drive in. Perhaps the ops team told it the cones were not there, or that the road was only one lane? It is also confused by the idea of a zone being terminated by workers removing the cones. It may want to wait for human confirmation of that, but doesn’t seem to get it.

Waymo has decided not to allow its remote operators to drive the car remotely; they can only give it planning advice. This decision is understandable — network latency can certainly make remote driving risky, particularly at speed or in regions of poor network quality. However, there should be exceptions for rare situations like this, as the risk of remote driving is less than the risk of what happened here. I am also unsure why the remote operator’s ability to give new paths to the vehicle could not have gotten it to a safe spot, not blocking traffic. The “wrong instructions” must have had far-reaching consequences.

Mistake #2 — inability to get to a “safe spot”

In a situation where a car stalls due to confusion, its goal should be to find a “safe spot” where it can wait to resolve the problem, rather than blocking traffic. The Waymo makes several poor choices here: first waiting in the middle of the feeder street, then stopping on the line between the two lanes (!), and later entering what is technically still a construction zone (though the zone is ending). The latter is the better choice: better to block the construction zone than to block traffic, as long as a human determines the zone is safe to enter, and it definitely is in this case. (It is possible that Waymo eventually did rely on human operator assistance in moving to this safe, about-to-end construction zone, but if so it took a lot of time. It is also possible all these strange choices were based on seriously wrong instructions which were never corrected, even though a Waymo remote operator was watching the whole time and talking to the passenger.)
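To make the idea concrete, here is a minimal sketch of the kind of stopping-spot preference being argued for: penalize anything that blocks a live lane or straddles a lane line, and only allow a closed work zone once a human has confirmed it is safe. The candidate list, field names and scoring weights are illustrative assumptions, not anything from Waymo’s actual planner.

```python
# A minimal sketch of the stopping-spot preference argued for above.
# Candidate types, weights and the human_confirmed_safe flag are assumptions
# for illustration; this is not Waymo's actual planner logic.

from dataclasses import dataclass

@dataclass
class StopCandidate:
    name: str
    blocks_live_lane: bool      # would stop the flow of traffic
    straddles_lanes: bool       # sits on the lane line between two lanes
    in_closed_zone: bool        # inside a coned-off construction zone
    human_confirmed_safe: bool  # a remote operator confirmed the zone is clear

def stop_score(c: StopCandidate) -> int:
    """Lower is better: prefer spots that keep traffic flowing."""
    score = 0
    if c.blocks_live_lane:
        score += 10   # worst: forces others into oncoming or coned lanes
    if c.straddles_lanes:
        score += 8    # confusing for everyone approaching
    if c.in_closed_zone and not c.human_confirmed_safe:
        score += 5    # only enter a work zone once a human says it is clear
    return score

candidates = [
    StopCandidate("middle of feeder street",
                  blocks_live_lane=True, straddles_lanes=False,
                  in_closed_zone=False, human_confirmed_safe=False),
    StopCandidate("on the lane line between lanes",
                  blocks_live_lane=False, straddles_lanes=True,
                  in_closed_zone=False, human_confirmed_safe=False),
    StopCandidate("inside the ending construction zone",
                  blocks_live_lane=False, straddles_lanes=False,
                  in_closed_zone=True, human_confirmed_safe=True),
]

best = min(candidates, key=stop_score)
print(best.name)   # -> "inside the ending construction zone"
```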

The truth is that nothing about this situation should have been beyond a remote operator’s ability to fix, even after the prior mistake. Presumably the hard part was for the remote operator to understand the situation without having clearly seen the new lane geometry. That geometry would have been obvious at the first place the vehicle got stuck, but without that history it is hard to reconstruct later. This is a flaw many robocar teams share — they focus almost entirely on understanding the present, and make predictions about the future based only on the present and the very recent past. Here I presume the car is not taking historical information into account, such as what the cones looked like at the intersection, or which of the two lanes other traffic has been driving in.
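As a rough illustration of what “using history” could mean here, the sketch below keeps a simple tally of which lane other vehicles have actually been using through the coned stretch, and only trusts that signal when it is strongly one-sided. The observation format and thresholds are assumptions made for the example, not a description of Waymo’s system.

```python
# A rough sketch of history-aware lane inference: remember which lane other
# traffic has been using through the coned-off section instead of reasoning
# only from the cones currently visible.

from collections import Counter

def infer_open_lane(lane_observations):
    """lane_observations: lane labels ("left"/"right") that other vehicles
    were seen driving in while passing the coned-off stretch."""
    counts = Counter(lane_observations)
    if not counts:
        return None                       # no history yet: fall back to cones alone
    lane, votes = counts.most_common(1)[0]
    # Require several observations and a clear majority before trusting history.
    if votes >= 3 and votes / sum(counts.values()) > 0.8:
        return lane
    return None

# Over the last minute, five other vehicles were seen using the left lane.
print(infer_open_lane(["left"] * 5))      # -> "left"
```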

Mistake #3 — Inability to use the licensed driver

When things get this bad, a car should not ignore the fact that there is somebody in the car — the passenger — who does understand the situation and is present in it. It makes sense to be able to get input from that person, if they are above the age of 10. And if they are a licensed driver, it should even be a last-ditch option to let them resolve the problem by taking control. Liability lawyers probably go nuts at the idea of the passenger taking the wheel. But if the passenger is a licensed and vetted driver, it should be a fallback plan for situations like this. Of course, that comes with the risk of a person unbelting and moving to the front seat, or getting out of the car in traffic. In future cars with no steering wheel there are still options, including a compartment with a video game controller, or even low-speed driving with a touchscreen — it’s actually something most people can do, particularly with safety systems that forbid hitting things.

While this should be very rare, it should be possible, because it looks particularly stupid when a clear and easy solution to the problem is sitting right there and you can’t make use of it.

Of course, if there is no passenger you can’t do this. But any passenger can and would offer guidance. In this case, one problem is that the remote operators did not have the history in their minds and would have needed to review video logs to get it. The passenger knew exactly what was going on, a closed lane, and could have given that information. It should have been used.

Mistake #4 — bad system for summoning rescue drivers

The funniest errors occur here, and there are many ways this could be improved. JJRicks keeps asking Waymo just how many rescue teams they have, and whether one tracks every car. They don’t, but during any stage of deployment it makes sense to have several and keep them close by. When errors are frequent, you want them ready; when errors are rare, it doesn’t cost as much because you need fewer of them. If there is even a hint of a stall-level problem, a rescue team can be dispatched immediately. If the car or the ops team solves the problem, the team can still keep going until it is confirmed fully solved. Then, and only then, can they stand down and wait for the next call.

This is expensive in the early phase when errors are being worked out. But it’s fine that it’s expensive then. You work to make it cheap by having fewer problems.

In this case, the fact that the car concluded it had solved the problem was not a reason to call off the rescue team. (Even in the case where it looked like it had really solved it and started driving at normal speed for a while.)

In particular, with a situation that is clearly not a one-off, like a long construction zone, it was foolish not to simply lock the car as soon as the rescue team was within a couple of minutes. The reality is that if there’s a problem, letting the car keep poking at it won’t do the job.
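The dispatch policy being advocated here is simple enough to express as a small state machine: dispatch on the first hint of a stall, stand down only on a confirmed resolution (not the car’s own optimism), and freeze the car once the rescue team is a couple of minutes away. The state names and the two-minute threshold below are assumptions for illustration, not Waymo’s actual protocol.

```python
# A minimal sketch of the dispatch-and-lock policy argued for in this section.
# States, inputs and the two-minute threshold are illustrative assumptions.

from enum import Enum, auto

class RescueState(Enum):
    IDLE = auto()
    DISPATCHED = auto()
    CAR_FROZEN = auto()

LOCK_WHEN_ETA_BELOW_S = 120   # freeze the car when the team is ~2 minutes out

def update_rescue(state, car_stalled, problem_confirmed_solved, rescue_eta_s):
    if state is RescueState.IDLE:
        # Dispatch immediately on any stall-level problem.
        return RescueState.DISPATCHED if car_stalled else RescueState.IDLE

    if state is RescueState.DISPATCHED:
        # Only human confirmation, not the car's own optimism, stands the team down.
        if problem_confirmed_solved:
            return RescueState.IDLE
        if rescue_eta_s <= LOCK_WHEN_ETA_BELOW_S:
            return RescueState.CAR_FROZEN   # car may no longer move on its own
        return RescueState.DISPATCHED

    # CAR_FROZEN: stay frozen until the rescue driver takes over.
    return RescueState.CAR_FROZEN

state = RescueState.IDLE
state = update_rescue(state, car_stalled=True, problem_confirmed_solved=False, rescue_eta_s=600)
state = update_rescue(state, car_stalled=False, problem_confirmed_solved=False, rescue_eta_s=90)
print(state)   # -> RescueState.CAR_FROZEN
```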

To sum up the hindsight advice:

  1. When a problem can’t be resolved by the car, dispatch the rescue team immediately, even before you have figured out the cause.
  2. If the remote operations team decides it’s a one-time problem they can solve, solve it and call off the team, in that order.
  3. If instructions from a remote operator are causing more problems, they should be easy to isolate and reverse with review from other remote operators.
  4. If the problem looks like it could recur, never call off the team.
  5. Have the remote ops team get the car to a safe spot, using an escalating path of small steps, actual remote driving, passenger-assisted driving and, if need be, passenger-behind-the-wheel driving. Plan the safe spot so there is a safe way for the rescue driver to enter the vehicle without having to walk in traffic, if possible.
  6. Lock the car from further moves without approval until the rescue team arrives.

I’m a bit surprised that Waymo needs this advice. Problems like those seen in the video are clearly not common for Waymo, but the video shows that when they do happen, the company needs to be more ready for them. This will be particularly true when they open up in San Francisco (where they are driving more these days).

At the same time, Waymo is alone in the USA in letting random members of the public ride in its vehicles and publish videos. They are taking the risk of exposing warts like these; others are not at that level yet.


