Tesla’s Activation Of Driver-Watching Camera Shows New Thinking


Tesla’s newest software update — the one that enables a more limited Autopilot in the new cars shipped without radar — also indicates they have enabled use of the camera above the rear-view mirror for watching the driver. Tesla refused to do this for a long time in spite of strong pressure, so the capitulation shows an interesting change of heart.

Tesla’s automation systems, even their “full” self-driving system, are driver assist systems which require the driver to pay attention to the road, ready to take over when the system makes a mistake or can’t handle the road. While Tesla is fairly explicit about this, people, being people, quickly learn to disregard that, and many will stop paying attention.

To mitigate that, Tesla requires that you keep your hands on the wheel and apply a mild steering force: not enough to turn the wheel, just enough to show that you’re there. If you don’t apply it for a while, the car concludes your hands are off the wheel and starts reminding you to apply the force. The warnings get less subtle, and eventually the car will slow to a stop.
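The escalation described above can be sketched, very roughly, as a small state machine. Everything here is invented for illustration — Tesla’s actual timings, thresholds, and escalation steps are not public:

```python
# Rough, hypothetical sketch of an escalating hands-on-wheel reminder.
# Thresholds are invented for illustration; Tesla's real timings differ.

class TorqueMonitor:
    VISUAL_NAG_S = 30    # seconds without torque before a visual reminder
    AUDIBLE_NAG_S = 45   # ... before a less subtle audible warning
    SLOWDOWN_S = 60      # ... before the car begins slowing to a stop

    def __init__(self, now=0.0):
        self.last_torque = now

    def report_torque(self, torque_nm, now, threshold_nm=0.5):
        # A mild force is enough; it need not actually turn the wheel.
        if abs(torque_nm) >= threshold_nm:
            self.last_torque = now

    def action(self, now):
        idle = now - self.last_torque
        if idle >= self.SLOWDOWN_S:
            return "slow_to_stop"
        if idle >= self.AUDIBLE_NAG_S:
            return "audible_warning"
        if idle >= self.VISUAL_NAG_S:
            return "visual_reminder"
        return "ok"
```

Note that any detected torque resets the clock, which is exactly why a weight strapped to the wheel (discussed below) defeats this design.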

Competing systems like Cadillac’s “Super Cruise” don’t require hands on the wheel. They use a camera, facing the driver, to make sure the driver is watching the road. You can look away from time to time, but not for prolonged periods.

Tesla put such a camera in their cars a couple of years ago but has never enabled it. Such a camera is useful in a robotaxi service to allow inspection of the inside of vehicles and video calls with passengers, among other things. A real robotaxi probably needs a better camera, able to see the whole interior, but with a physical shutter so it can only look between rides; that way it can check whether passengers have left things on the seats, or soiled them.

Tesla’s torque approach can be defeated by attaching a weight to the wheel, and some reckless drivers are known to do that so they can treat the car like a robocar and play with their phone or otherwise ignore the road. Such drivers obviously know they are trying to bypass the system, but because the system is good enough that they get away with it most of the time, people still do it. Or they keep a hand on the wheel but play games or watch movies. In a small number of cases, Teslas on Autopilot have been involved in serious and even fatal crashes where the driver was obviously not paying attention.

Those crashes are the driver’s fault under the law, but many have said Tesla should do something about them anyway. Tesla has refused, until now.

The software change has not, however, caused Tesla to let drivers take their hands off the wheel yet: drivers report it still nags if you don’t torque the wheel. It is commonly expected that at some point it will stop doing this, and instead nag if you take your attention from the road for too long, or presumably if you block the camera.
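A gaze-based nag of the kind anticipated here might look, in very rough outline, like the sketch below. The glance allowance, status labels, and treatment of a blocked camera are all assumptions for illustration, not anything Tesla has described:

```python
# Hypothetical sketch: brief glances away from the road are tolerated,
# but prolonged inattention, or a blocked camera, triggers a nag.
# The 2-second glance allowance is an invented figure.

GLANCE_ALLOWED_S = 2.0

def attention_state(samples):
    """samples: time-ordered (timestamp, status) pairs, where status is
    'road', 'away', or 'blocked'. Returns 'ok' or 'nag'."""
    away_since = None
    for t, status in samples:
        if status == "road":
            away_since = None        # eyes back on the road: reset
        elif away_since is None:
            away_since = t           # 'away' or 'blocked' starts the clock
        if away_since is not None and t - away_since > GLANCE_ALLOWED_S:
            return "nag"
    return "ok"
```

Treating a blocked camera the same as looking away is one plausible design choice; it closes the obvious loophole of simply covering the lens.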

There are some interesting trade-offs:

  • Drivers certainly like not having to keep torquing the wheel. It’s more comfortable to just have hands off.
  • Torque detection is not hands-on detection. Many drivers forget to apply torque even when they are keeping hands on the wheel, and get the warning when they are acting properly, which is annoying.
  • Gaze monitoring seems to be a better measure of attention to the road. Super Cruise’s accident rate is superb at present, while Tesla’s accident rate with Autopilot on is similar to, or slightly worse than, with Autopilot off.
  • Gaze monitoring appears to be harder to deliberately defeat.
  • Hands-on should allow a faster reaction time when an intervention is needed.
  • Hands-on allows a “drive in your head” approach, a good, attentive way to supervise the car: let the wheel move your hands while imagining you are moving them, and take over if they don’t move as you expect.

With these factors, it’s surprising Tesla didn’t implement this sooner, or at least make it a choice for drivers. Software to do gaze monitoring has been available for free since Uber had a safety driver who, unmonitored, watched a video, resulting in the death of a pedestrian. Tesla does believe in driver responsibility, and to some degree has resisted “nanny” functionality in their cars, but only to a point. They have the torque monitoring. They let you go over the speed limit with their cruise control (like almost all cruise controls) but only to a certain point. The new radar-less Autopilot limits this even more.

Cars have traditionally been “the driver is fully in charge” technology. Tesla generally supports this, but having made the world’s most computerized car, they have probably also implemented more “the car and the carmaker are in charge” features than anybody else. It’s a strange dual philosophy.

Tesla promises that all data from the camera is kept in the car and not uploaded to Tesla. But will that remain true after an Autopilot accident? Will logs remain in the car that might implicate you in something (not just bad driving)?

Another interesting question is whether drivers should have the power to deliberately turn off the camera, doing officially what some do covertly when they tape a grapefruit to the wheel to fake torque. If a driver makes such an overt decision, it should absolve Tesla of responsibility. Certainly no other car monitors drivers in this way without an option to disable it. Most drivers would leave it on. While it seems a huge fraction of people knowingly engage in distracted driving (texting in a regular car is much less safe than doing it with Autopilot on), many realize they are being bad and feel a reminder can stop them from doing something stupid. But if they do something stupid, they can hurt others, not just themselves, at least on public roads.
