Transportation

A Robocar Specialist Gives Tesla ‘Full Self-Drive’ An ‘F’


I’ve owned a Tesla since 2018 and I’m a huge fan of the car. My background is in self-driving cars: in addition to writing about them for 15 years, I’ve worked for and advised a wide variety of companies including Waymo, Zoox, Cruise, Starship, giant car OEMs and many others. So naturally I’ve been very interested in both Autopilot and the promised future system Tesla calls “full self-driving.” Tesla has released a prototype version of that system to some Tesla owners, and I finally got it recently. I’ve watched many videos of it in action, some good, some bad, and wanted to see it for myself. So here’s my review, including a video of a sample drive with, sadly, too many mistakes.

(The text of the video review is below, so if you read this text you can skip to the 2nd chapter of the video, about 5 minutes in, to see the sample ride.)

I have great respect and admiration for Elon Musk, so I’m sorry to say this, but … it’s terrible. I mean really bad. After all those videos I didn’t expect a lot, but I expected more than this. My first drive home after activating it was frightening. You’re going to see the second loop I did, one around Apple’s headquarters in Cupertino, California. I’ve now driven this loop a dozen times with the system on, and each drive is different, with a different pattern of errors, several of them serious.

This is not a complex set of roads. It’s typical Silicon Valley suburbia, in the same valley where Tesla’s HQ and most of its developers are. There’s a fast arterial and some afternoon traffic, but aside from the straight sections, there’s no turn or other complexity on the route that it didn’t screw up at least once in my loops.

Self-driving is a very different problem from driver assist, such as we find in Autopilot, a product I enjoy using. Some think it’s just a difference of degree; more think they are two different things. Even if, like Tesla, you think you can just keep improving driver assist until you have self-driving, this system has very far to go.

One thing that’s not apparent in this video, or in others, is how jerky the ride is. Rides are full of sudden accelerations, braking and jerky turns that would quickly get you fired as a chauffeur. Those don’t cause safety issues (in fact, they are no doubt done in the name of safety) but they show the roughness of the product. It’s particularly bad when making unprotected turns: it starts out timid, as it should, then advances and jerks when it decides to be timid again. While this will get fixed in time, these rough edges show just how immature the product is today.
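
To put numbers on “jerky”: ride comfort is commonly measured by jerk, the rate of change of acceleration. Here is a minimal Python sketch, entirely my own illustration with made-up sample numbers (nothing from Tesla), contrasting a chauffeur-grade acceleration trace with a rough one:

    # Illustrative only: quantify ride comfort as peak jerk (m/s^3),
    # the rate of change of acceleration. Sample numbers are invented.
    def peak_jerk(accel, dt):
        """Peak jerk of an acceleration trace (m/s^2) sampled every dt seconds."""
        return max(abs((a2 - a1) / dt) for a1, a2 in zip(accel, accel[1:]))

    dt = 0.1  # 10 Hz samples
    # A smooth chauffeur ramps acceleration up and down gradually...
    smooth = [0.0, 0.3, 0.6, 0.9, 1.0, 1.0, 0.9, 0.6, 0.3, 0.0]
    # ...while a rough planner slams between braking and accelerating.
    jerky = [0.0, 1.0, -0.8, 1.2, -1.0, 0.5, -0.5, 1.0, 0.0, 0.0]

    print(f"smooth: {peak_jerk(smooth, dt):.0f} m/s^3")  # 3 m/s^3
    print(f"jerky:  {peak_jerk(jerky, dt):.0f} m/s^3")   # 22 m/s^3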

Waymo’s self-driving cars have now gone probably 8 million miles in Arizona without ever being at fault in an accident. Of course, in that distance they have made many small mistakes, some matching what I am describing in my Tesla, and Waymo doesn’t tell us about all of these. Even so, it’s orders of magnitude fewer. My Tesla can’t seem to go more than a few non-highway miles without making mistakes of various levels of severity.

Fortunately, I did not have a crash. As one must, I intervened whenever the system looked like it was doing something dangerous. In some of those situations, it’s quite possible the car would have corrected itself and not hit anything. The time it tried to run a red light is an example of one where it clearly wasn’t going to fix things on its own.

A lot of the time it was just too timid. Self-driving requires being safe, but also being a good road citizen. Many times in my trips I found the car blocking other cars, and even getting honked at. In those cases I intervened to tell it to go, and in some cases, it seemed it was never going to go otherwise.

So I’m giving Tesla FSD an “F” when it comes to self-driving. In fact, it clearly shouldn’t have that name, as many have pointed out. It should have a driver-assist name, so I will call it “Street Autopilot”. The problem is, I have to give it a “D” even as a driver-assist product. Using it is a harrowing experience; it’s definitely not relaxing, nor is it providing much assistance.

To be fair, I and many others wondered if Autopilot could be a pleasant and relaxing product. If you use it right, on the highway, it is; I saw that quickly after using it. It’s hard to see that for Street Autopilot, even if it gets a lot better. Highway Autopilot takes a gently winding highway and makes driving on it like driving on a beeline-straight road with light traffic. You’re still driving, but it’s easier. Decisions are rarely quick. You have to focus, but on different things. In traffic, it does what any adaptive cruise control does to take the stress out.

Street Autopilot is much harder to make relaxing. The car is now making sharp turns and quick decisions. Other road users, including pedestrians, are coming from all directions. You have to stay fully aware. When you drive yourself, you know your plan in advance, and you know when things are going according to it. With Street Autopilot, the car is doing the planning, so its moves can be a surprise to you. This probably gets a bit better as you get more used to how it drives, but I haven’t reached that point yet. I’ll report more on that later.

About 5 minutes into the video you will see commentary on various situations where it had problems, including:

  1. Yielding too long at a 3-way stop, even though it was clearly there first
  2. Veering towards a trailer parked on the side of a quiet street
  3. Being very slow turning onto an arterial and getting honked at
  4. Pointlessly changing lanes for a very short time
  5. Failing in many ways at a right turn onto a major street that has its own protected lane, almost always freezing and not knowing what to do
  6. Jerky accelerations and turns
  7. Stalling for long times at right turns on red lights
  8. Suddenly veering off-course into a left turn that’s not on the route, then trying to take that turn even though the light is red!
  9. Finding itself in a “must turn left” lane and driving straight out of it, or veering left into oncoming traffic
  10. Handling a basic right turn with great uncertainty, parking itself in the bike lane for a long period to judge oncoming traffic
  11. Taking an unprotected left with a narrow margin, and doing it so slowly that the oncoming driver has to brake hard

All of these occurred on a simple 3.5-mile loop in a suburban residential/commercial area. (They didn’t all happen on one drive, but most drives had several of them, and each drive had a different pattern of errors.)

I haven’t used the product long enough to be 100% sure that this loop isn’t especially problematic, but I doubt it. I’ve seen these errors in many videos.

So let’s rate these results by examining the core competencies of a self-driving car.

Mapping

Many of these faults can be blamed on Tesla’s decision not to have detailed maps. Tesla uses navigation maps and some lane-level maps, and even has (though it doesn’t admit it) detailed maps of certain tricky areas. But it doesn’t have enough, and many of these problems would not have happened with better maps. In particular, the confusion over the protected right turn lane, and over the unusual lane geometry at the entrance to Apple headquarters, would not have happened. The strange left turn that ended up going through a red light also might have been prevented.

Tesla instead draws its maps on the fly as it drives. And it often draws them wrong. It’s impressive that it can do this right a large fraction of the time, but a large fraction is not enough.
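
To illustrate the difference a prior map makes, here is a hypothetical Python sketch (the intersection IDs, confidence numbers and structure are all invented; this is not how Tesla’s stack works, just the general idea) of a planner that prefers surveyed geometry and only falls back to what the cameras infer live:

    # Hypothetical: prior-map lookup with a live-perception fallback.
    # All names and numbers here are invented for illustration.
    PRIOR_MAP = {
        # Surveyed in advance, so lane geometry is known with confidence.
        "intersection_17": {"right_turn_lane": "protected", "confidence": 0.99},
    }

    def lane_geometry(intersection_id, perceived):
        """Prefer surveyed geometry; otherwise trust the live estimate."""
        entry = PRIOR_MAP.get(intersection_id)
        if entry is not None:
            return entry
        # No prior map: the on-the-fly estimate is right a large fraction
        # of the time, but a large fraction is not enough.
        return {"right_turn_lane": perceived, "confidence": 0.90}

    print(lane_geometry("intersection_17", perceived="unknown"))
    print(lane_geometry("unmapped_corner", perceived="unprotected"))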

Localization

While the first task for most cars that use maps is to figure out their position on the map, Tesla draws most of its own maps and so is automatically at their origin. It has a more basic job of matching them to its navigation maps. It’s not clear it does this right all the time, though.

Perception

Tesla’s most controversial decision is to use only cameras for perception, with no LIDAR or even radar, which almost all other teams use. My tests are not extensive enough to give it an accurate perception score; even 99.99% would still be a seriously inadequate score, and measuring that requires detailed work. It is clear, though, looking at Tesla’s perception visualization, that many objects on the road are winking in and out. That’s not entirely uncommon: many other perception systems will temporarily lose track of objects on the road, though the less of this the better.
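
Quick arithmetic shows why even 99.99% would be inadequate. Assuming, purely for illustration, a 30 frame-per-second camera and independent per-frame failures:

    # Back-of-the-envelope: why "99.99% perception" is still inadequate.
    # The frame rate and independence are my own simplifying assumptions.
    fps = 30
    per_frame_reliability = 0.9999
    misses_per_second = (1 - per_frame_reliability) * fps  # per tracked object
    minutes_between_misses = 1 / misses_per_second / 60
    print(f"one missed detection every {minutes_between_misses:.1f} minutes per object")
    # ~5.6 minutes: far too frequent over a lifetime of driving, which is
    # why scoring perception properly requires detailed, large-scale work.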

Tesla’s visualizations also don’t show very much range for their perception system, though it may have more range internally. With a good perception system, a human could drive the car looking only at the perception display, like a video game.

Forecasting and permanence

After a car detects things on the road that move, its most important job is to predict where they are likely to go. Indeed, a large reason for trying to classify them is to help predict them. And because detections do wink in and out, you want to be able to connect detections of the same obstacle over time, since we know there are no Star Trek transporters.

The jerky depictions on the Tesla screen indicate there is still work to be done here.
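
The standard remedy, sketched below in Python, is a tracker with object permanence: a track survives a few frames of missed detections by coasting on a constant-velocity prediction, then re-associates nearby detections when they reappear. This is my own illustration of the general technique, not Tesla’s tracker:

    # Illustrative nearest-neighbor tracker with object permanence.
    import math

    class Track:
        def __init__(self, x, y):
            self.x, self.y = x, y
            self.vx, self.vy = 0.0, 0.0
            self.missed = 0  # frames since the last matched detection

    def update_tracks(tracks, detections, dt=0.1, gate=2.0, max_missed=5):
        """Match (x, y) detections to the nearest predicted track within `gate` meters."""
        unmatched = list(detections)
        for t in tracks:
            # Predict forward on constant velocity; this is what lets a
            # track coast through frames where its detection winks out.
            px, py = t.x + t.vx * dt, t.y + t.vy * dt
            near = min(unmatched, default=None,
                       key=lambda p: math.hypot(p[0] - px, p[1] - py))
            if near and math.hypot(near[0] - px, near[1] - py) < gate:
                t.vx, t.vy = (near[0] - t.x) / dt, (near[1] - t.y) / dt
                t.x, t.y = near
                t.missed = 0
                unmatched.remove(near)
            else:
                t.x, t.y = px, py  # no detection this frame: keep coasting
                t.missed += 1
        tracks.extend(Track(x, y) for x, y in unmatched)  # spawn new tracks
        return [t for t in tracks if t.missed <= max_missed]  # drop stale ones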

Planning

In part because the car is changing its mind about the geometry of the road, its planning for where to drive is often erratic. It displays this clearly by showing you the paths it is considering. And several times in my short drives, it made very bad plans, which needed intervention.
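
One standard remedy for plan flip-flopping is hysteresis: candidate paths are scored, but abandoning the current plan pays a penalty, so the car only switches for a clear win. A minimal sketch of the idea (my own illustration, with invented names and numbers, not Tesla’s planner):

    # Hysteresis in plan selection: near-ties don't cause flip-flops
    # because switching away from the current plan pays a penalty.
    def choose_plan(candidates, current, switch_penalty=0.3):
        """candidates: dict of plan name -> cost (lower is better)."""
        def effective_cost(name):
            return candidates[name] + (switch_penalty if name != current else 0.0)
        return min(candidates, key=effective_cost)

    costs = {"stay_in_lane": 1.00, "change_left": 0.95}  # a near-tie
    print(choose_plan(costs, current="stay_in_lane"))  # stays: 1.00 < 1.25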

It also takes exits in a way that doesn’t match most human drivers, attempting to drive right down the center, which means it turns later than a typical human driver would, for a less smooth turn. That’s also an issue for people supervising it in driver-assist mode, because they don’t know if they need to intervene. In driver assist, you need to telegraph intentions well to keep the driver confident.

Actuation

Most of the jerky driving was due to the erratic planning, but in a few cases the car’s own execution of simple plans was strange. Wild swings of the steering wheel are common. At one intersection, the steering motor sat making odd clicking noises while not actually moving, as though it were being engaged for very short periods.

Road Citizenship

The vehicle scores very poorly for road citizenship (which is in many ways part of planning). I tried to avoid having cars behind me, and many times I needed to intervene and tell the car to go to avoid being a burden. Since I was testing the system, I will admit I was unfair to some of the drivers behind me, but I intervened before anything got too terrible. The car is also sometimes sudden with its lane changes.

Safety

The safety record was unacceptable. I would have one to three safety interventions on each 3.5-mile loop. While it’s not 100% certain the car would not have recovered on its own, even if it had, it would still be a bad situation to veer off the road and just barely make it back, particularly if this happens with other drivers around.
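
In concrete terms (simple arithmetic on the numbers above):

    # Back-of-the-envelope: miles between safety interventions on my loop.
    loop_miles = 3.5
    min_interventions, max_interventions = 1, 3  # observed range per loop
    print(f"one intervention every {loop_miles / max_interventions:.1f} "
          f"to {loop_miles / min_interventions:.1f} miles")
    # Roughly 1.2 to 3.5 miles. Waymo's ~8 million miles without an
    # at-fault accident is a different metric, but the gap between
    # "miles" and "millions of miles" is the point.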

How to make a Street Autopilot

If Tesla were actually trying to make a Street Autopilot driver assist system, it should do it differently. A driver assist system has to make driving easier, not more scary.

To do that, you would not have the car attempt any situation it can’t handle reliably. This might mean it requires the driver to take over for unusual turns and intersections, rather than having the driver wait to see if the car handles it this time.

A good plan is to telegraph intentions to the driver as early as possible. I would add a heads-up display to a car of this sort, and on the heads-up display, paint the car’s plan on top of the real world. The driver can then see if that plan makes sense, and allow the car to follow it when it does. The plan should deliberately have red points where it says, “you will be taking over here” so they are visible well in advance.

It might be a fine and usable product if it just gave you options to confirm, like “turn now” and “drive this lane for 3 miles” and all you have to do is be ready for rare errors. And no errors that involve veering into things, obviously. The sooner the car realizes it wants to do something unusual, the sooner it should let you know.
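
As a sketch of what that confirm-to-proceed interaction could look like in code (the Maneuver type, prompts, and flow are entirely my invention, not any real Tesla interface):

    # Hypothetical confirm-to-proceed driver-assist loop, invented to
    # illustrate the interaction model described above.
    from dataclasses import dataclass

    @dataclass
    class Maneuver:
        description: str      # e.g. "drive this lane for 3 miles"
        reliable: bool        # can the system handle this unaided?
        seconds_ahead: float  # how early the driver is told

    def present(m: Maneuver):
        if m.reliable:
            print(f"[{m.seconds_ahead:>3.0f}s ahead] confirm: {m.description}")
        else:
            # A "red point": telegraphed well in advance, so the driver
            # knows exactly where they will be taking over.
            print(f"[{m.seconds_ahead:>3.0f}s ahead] YOU WILL TAKE OVER: {m.description}")

    route = [
        Maneuver("drive this lane for 3 miles", reliable=True, seconds_ahead=10),
        Maneuver("unprotected left onto the arterial", reliable=False, seconds_ahead=30),
        Maneuver("turn right at the signal", reliable=True, seconds_ahead=15),
    ]
    for m in route:
        present(m)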

Tesla has a long way to go on this product. It might be nice if they put their energy into things that would be more useful to drivers today, while still working on their self-driving ambitions. While it’s very hard for them to retroactively add a LIDAR to their cars, adding mapping is a software problem. Indeed, they could take the approach of companies like Mobileye, which makes use of its giant fleet of cars to build and constantly update its maps. Tesla is one of the very few companies with a giant fleet it can load software into to do the mapping job. Other companies should be jealous of that ability. (Tesla does use the fleet to gather training data for its systems and for some map updates, but doesn’t build more detailed maps.)

Many people have tried Tesla “FSD” (Street Autopilot) and been impressed. This is only because they haven’t worked on real self-driving cars. They think being able to handle many situations is impressive, and it is by some standards. It’s what you don’t handle that matters, though. Tesla’s performance is beyond what Waymo could do a decade ago, though it does it without a map, which is impressive but the wrong path.

Elon Musk has sadly predicted that Tesla FSD would be ready “very soon” for the past 4 years or more. Even fans are starting to be wary of those promises. It still has a long way to go, and Elon and his team need to go take rides in Waymo cars in Arizona or California for a while to reset their expectations. Then get to work.
