
Another Alleged Tesla Autopilot Failure Raises Questions On Tesla Training System


FILE – In this March 23, 2018, file photo provided by KTVU, emergency personnel work at the scene where a Tesla electric SUV crashed into a barrier on U.S. Highway 101 in Mountain View, Calif. (KTVU via AP, File) The Russian crash also involved a Tesla, claimed to be on Autopilot, hitting something to its left and bursting into flames, but there were differences as well.

Russian reports detail an injury accident in which a Tesla hit a tow truck parked on the left side of the road, with another car in front of it. With minimal shoulder, cars have to go around the truck. The driver claims he was driving his Model 3 with Autopilot on but was briefly distracted, so he did not, as Autopilot requires, take control in this tricky situation and steer around the truck. The car braked shortly before impact, then struck a glancing but serious blow, injuring the driver and bruising his children. After the accident, the vehicle caught fire and, as electric cars can do, burned for quite some time. Electrek has more details.

Tesla has not confirmed that Autopilot was in operation. The brake lights seen may have been generated by Autopilot or by Tesla’s “Automatic Emergency Braking” system, which is always on and should have generated a loud beeping warning before the collision to get the attention of the distracted driver.

While Autopilot is a driver-assist system, and not (yet) a robocar, Tesla claims it is just on the verge of being one, and that it can produce a “full self-driving” car sometime this year. As such, it is reasonable to examine why Autopilot might have hit this vehicle even though the driver failed in his duties.

This is just the sort of situation that concerns people about computer vision systems: the rare case the development team has never seen before. Traffic is not that heavy, so cars are simply steering slightly to avoid the tow truck, which juts just a couple of feet into the lane. The tow truck is partly obscured by the car it is going to tow, which is narrower and juts less into the lane. This is not an everyday occurrence.

Tesla has proudly shown their answer to this. When they find something rare, they have the ability to run a sort of “search engine.” They send instructions out to the hundreds of thousands of Teslas driving the roads, saying, “If you see something like this, please send us images of it.” In theory, they should then get lots of examples. They can label these images and feed them into the machine learning system so that their tools can identify all variants of the unusual obstacle, and so the cars will recognize it when they see it.
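As a rough illustration of how such a fleet query might work, here is a minimal sketch in Python. Everything here is hypothetical: the names (TriggerCampaign, matches_scenario) and the lane-overlap thresholds are invented for illustration, since Tesla’s actual trigger and data-collection infrastructure is not public.

```python
# Minimal sketch of the fleet "search engine" idea described above.
# All names and thresholds are hypothetical, not Tesla's.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Detection:
    label: str              # e.g. "truck", "car"
    lane_overlap_m: float   # how far the object juts into the ego lane, in meters
    is_stationary: bool

@dataclass
class TriggerCampaign:
    """Pushed to the fleet: 'if you see something like this, send us images.'"""
    name: str
    collected: List[bytes] = field(default_factory=list)

    def matches_scenario(self, detections: List[Detection]) -> bool:
        # Hypothetical rule for the "stalled truck partly in the left lane" case:
        # a stationary truck-like object overlapping the lane by a small margin.
        return any(
            d.is_stationary and d.label == "truck" and 0.3 < d.lane_overlap_m < 1.5
            for d in detections
        )

    def maybe_collect(self, camera_frame: bytes, detections: List[Detection]) -> None:
        if self.matches_scenario(detections):
            self.collected.append(camera_frame)  # queued for upload and labeling

# Example: one car's perception output checked against the campaign.
campaign = TriggerCampaign(name="stalled-tow-truck-left-lane")
frame = b"<jpeg bytes>"
campaign.maybe_collect(frame, [Detection("truck", lane_overlap_m=0.8, is_stationary=True)])
print(len(campaign.collected))  # 1 -> this frame would be sent back for labeling
```

The collected frames would then be labeled and fed into training, which is the step the article describes next.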

Tesla will certainly do that now, after this accident, if Autopilot was indeed on and failed. The problem is that this situation really isn’t new or unusual. In fact, there have been other accidents in which Teslas hit work vehicles stopped and jutting into the left lane. It’s a classic hard problem: if one of these is on the road and cars are going around it, you may not see the stopped vehicle until it is suddenly revealed when the car (or worse, truck) in front of you pulls right to avoid it. So the question of stalled vehicles partly in the left lane is not new, and Tesla should have run that search engine long ago, and several times. Yet it appears the technique may not have worked in this instance. We won’t know why unless Tesla tells us more.

Because Autopilot is a driver-assist system, it is fully acceptable that it doesn’t handle unusual cases like this and depends on the alert driver to handle them. (Well, some argue that if the system makes drivers less alert, it is not acceptable, but for now it is.) The issue is that Tesla asserts it will have much more advanced “full” self-driving, “feature complete” though not perfect, in 2019. This accident suggests they are further away than that, possibly much further.

Tesla’s “search engine” ability is very valuable, though not unique. Anybody with a very large archive of millions of miles of video can search it; it’s not necessary to search the future, as the past is fine. Tesla has the advantage that its customers pay for the cost of all this driving and searching, while other companies have to work harder for it. A more distinctive ability Tesla has is “shadow mode” testing. Once it trains its system to do better on stalled tow trucks, it can have large numbers of live driving cars run the new software to see how it does, and compare it to the old software in the real world.
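Here is a minimal sketch of the shadow-mode idea, assuming a simplified planner interface. The function and class names (Plan, shadow_compare) and the tolerance are illustrative assumptions, not Tesla’s actual software; the point is only that the candidate system runs alongside the shipped one and never controls the car, with disagreements logged for review.

```python
# Sketch of "shadow mode" evaluation under a simplified, assumed planner interface.
from typing import Callable, List, NamedTuple

class Plan(NamedTuple):
    steer_deg: float   # commanded steering angle
    brake: bool        # whether the planner wants to brake

Planner = Callable[[dict], Plan]  # maps a sensor snapshot to a plan

def shadow_compare(shipped: Planner, candidate: Planner,
                   drive_log: List[dict],
                   steer_tolerance_deg: float = 2.0) -> List[int]:
    """Run the candidate alongside the shipped planner on the same inputs.

    The candidate never controls the car; we only record the frames where
    the two planners disagree, so engineers can review them offline.
    """
    disagreements = []
    for i, snapshot in enumerate(drive_log):
        a, b = shipped(snapshot), candidate(snapshot)
        if abs(a.steer_deg - b.steer_deg) > steer_tolerance_deg or a.brake != b.brake:
            disagreements.append(i)
    return disagreements

# Toy example: the candidate brakes for a stalled truck, the shipped version does not.
shipped = lambda s: Plan(steer_deg=0.0, brake=False)
candidate = lambda s: Plan(steer_deg=1.0, brake=s.get("stalled_truck_ahead", False))
log = [{"stalled_truck_ahead": False}, {"stalled_truck_ahead": True}]
print(shadow_compare(shipped, candidate, log))  # [1] -> a frame worth reviewing
```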

Tesla is sometimes a fairly transparent company, but at other times a highly secretive one. Unfortunately, when it comes to details like this, it has not been very forthcoming with information. One reason, at least in the USA, is that once the NTSB gets involved in an investigation, it forbids the company from saying anything. Last week, documents released under the FOIA request about Tesla’s battle with NHTSA and its claims of safety contained a few pages from one of those investigations.

That report detailed an accident in which a car drifted out of its lane and into a guardrail. Tesla said “Autopilot was not on,” but that’s a bit misleading. The driver was a regular user of Autopilot, but seems, from the reports, to maybe be one of the “bad ones” who treat it like a full self-driving car. Tesla’s revealed logs show the driver constantly got warnings because the system did not detect hands on the wheel, and he ignored them. Tesla can’t actually detect hands on the wheel; instead, it detects whether your hands apply steering torque to the wheel from time to time, and it is common to get the warning even when your hands are constantly on the wheel. This driver ignored visual warnings, then audible warnings, and finally responded only when the car started slowing down. That is not normal.

To respond to the warnings, you must torque the wheel, which he did. But if you torque the wheel more than a certain amount, you disengage autosteering while leaving cruise control on. The Tesla logs confusingly state that it was cruise control that was disabled, not autosteer, though it is not normally possible to do that. In any event, it is possible the driver thought he had responded to the warnings with a small tweak, as one usually does, and it can be speculated that he then went back to improperly using the system, i.e., doing something like looking at a phone. The vehicle, however, was not in autosteer mode, so it did what any car will do (newer Teslas have some protections against this) and drifted out of its lane and into an accident.
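To make the escalation and torque logic described above concrete, here is a minimal sketch. The timing thresholds, torque values, and function name (autosteer_step) are all assumptions made up for illustration; Tesla’s real values and internal logic are not public.

```python
# Sketch of torque-based attention monitoring and warning escalation.
# All thresholds below are assumed values, not Tesla's.

NAG_AFTER_S = 30           # seconds without detected torque before a visual nag (assumed)
AUDIBLE_AFTER_S = 45       # escalate to an audible warning (assumed)
SLOWDOWN_AFTER_S = 60      # begin slowing the car (assumed)
DETECT_TORQUE_NM = 0.5     # torque above this counts as "hands detected" (assumed)
DISENGAGE_TORQUE_NM = 2.5  # torque above this disengages autosteer (assumed)

def autosteer_step(seconds_since_torque: float, applied_torque_nm: float):
    """Return (autosteer_engaged, warning_level) for one control step."""
    if applied_torque_nm > DISENGAGE_TORQUE_NM:
        # A strong tug disengages autosteer but leaves cruise control on,
        # so the driver may not realize they are no longer being steered.
        return False, "autosteer_disengaged"
    if applied_torque_nm > DETECT_TORQUE_NM:
        return True, "none"           # torque detected, nag timer resets
    if seconds_since_torque > SLOWDOWN_AFTER_S:
        return True, "slowing_down"
    if seconds_since_torque > AUDIBLE_AFTER_S:
        return True, "audible"
    if seconds_since_torque > NAG_AFTER_S:
        return True, "visual"
    return True, "none"

# The escalation path the report describes: visual, then audible, then slowing down.
for t in (35, 50, 65):
    print(t, autosteer_step(seconds_since_torque=t, applied_torque_nm=0.0))
# A hard tug: autosteer drops out while cruise control stays on.
print(autosteer_step(seconds_since_torque=0, applied_torque_nm=3.0))
```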

This is one of the big questions about good driver-assist systems: to what extent do they lull drivers into thinking they can treat them like a self-driving car? It may be that this driver made that mistake and paid the price.
