
Yesterday’s Waymo Crash In Arizona Did Not Involve Self-Driving, But Some Day One Will


On Jan 30, a crash was reported between a Waymo Chrysler Pacifica minivan and another vehicle, causing minor injuries to the safety driver. Reports suggest the other car acted erratically and cut quickly in front of the Waymo vehicle, causing it to rear-end the other car. According to Waymo, the vehicle was being driven manually, with autonomous systems off, on its way back to the Waymo depot in Chandler, AZ. Early reports that a passenger was on board were incorrect.

As a crash between two ordinary human-driven cars, it is not especially notable, and fortunately the injuries were light. The remaining interesting detail is the suggestion that the other car might have been trying to “play” with the Waymo vehicle, presuming it was in self-driving mode, perhaps to see what it would do.

In the early days, many people told me that when they first saw a robocar on the streets, they were tempted to interact with it to see what it would do. In reality, very few ever actually did. Those who tried did not get what they wanted: safety drivers are trained to take control in any dangerous situation like that, so all they achieved was making illegal moves in front of a human-driven car.

Earlier we saw a situation where a person trying to film an unmanned Waymo car did some strange driving around it. I criticized the car for proceeding as though nothing strange were going on, though in the end everything worked out. In time, as the novelty wears off, this sort of activity should fade.

Even so, the cars should soon reach a level where the safety driver need not disengage, and where the system, with its superior knowledge of physics, might actually do better than a human driver at avoiding a potential accident. It’s not easy, though, because the car probably won’t want to swerve, given the risks that swerving creates for other cars on the road. It might swerve on an empty road but not an occupied one; cars will not want to be too erratic themselves. One of the big challenges in robocars today is getting good at predicting what other road users will do. I am not sure how much attention has been paid to predicting what a dickish driver trying to screw with you will do, but it’s not out of the question.
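To give a feel for what “superior knowledge of physics” means in practice, here is a minimal, purely illustrative sketch in Python (with made-up numbers, not Waymo’s method) of the kind of kinematic check a planner might run when a car cuts in: can hard braking alone keep the gap from closing, or might something riskier, like a swerve, be needed?

```python
# Illustrative sketch only (not Waymo's system): decide whether hard braking
# can avoid rear-ending a car that has just cut in, using basic kinematics.

def can_brake_in_time(gap_m: float,
                      own_speed_mps: float,
                      lead_speed_mps: float,
                      max_decel_mps2: float = 6.0,   # assumed braking capability
                      reaction_s: float = 0.1) -> bool:  # assumed system latency
    """Return True if braking closes less distance than the current gap.

    gap_m: bumper-to-bumper gap to the cut-in vehicle (metres)
    own_speed_mps / lead_speed_mps: current speeds (metres per second)
    """
    closing_speed = own_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return True  # not closing on the lead car at all

    # Distance lost during the reaction delay, plus the distance needed to
    # cancel the closing speed at constant deceleration: v^2 / (2a).
    closing_distance = (closing_speed * reaction_s
                        + closing_speed ** 2 / (2.0 * max_decel_mps2))
    return closing_distance < gap_m


if __name__ == "__main__":
    # A car cuts in 8 m ahead, travelling 5 m/s slower than we are.
    print(can_brake_in_time(gap_m=8.0, own_speed_mps=20.0, lead_speed_mps=15.0))
```

A real system evaluates far richer predictions of what the other car will do next, but the point stands: a computer can run this arithmetic continuously and precisely, where a startled human is guessing.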

Driving like this is generally illegal, and I suspect Waymo will push for a severe punishment for the other driver, to send a message that it should not be done.

The more interesting long-term issue will be the road users, from aggressive drivers to pedestrians, who start “mistreating” robocars because they know the cars will always yield. The normal instinct for a company building something like this is to make it conservative and timid, yielding right-of-way rather than taking risks. It may well be possible and safe for a pedestrian to walk in front of a robocar with only a modest amount of clearance, sure that it will stop. Aggressive drivers will be pleased to share the road with cars they can cut off safely and with impunity. This is not a problem yet, but it will arise in the future, and solutions will need to be worked out.

Waymo has had other news recently, including the launch of a pilot project with UPS. In this project, Waymo vans (with safety drivers) will carry packages between a UPS depot and UPS stores. That way Waymo does not need to worry about delivering to customers, and there will be a UPS employee at both ends to handle the packages. While Waymo’s prime business thrust is the robotaxi market, they are making more overtures in the logistics market as well, which may be easier to exploit at an earlier date.


