
Tesla and Uber draw scrutiny at Senate hearing on self-driving cars: 'That's not safe!'


An Uber self-driving car drives down 5th Street in San Francisco, California.

Justin Sullivan | Getty Images

Several senators offered sharp criticism of Tesla and Uber during a Commerce Committee hearing on self-driving vehicles on Wednesday.

Sen. Roger Wicker, R-Miss., who chairs the committee, referenced a fatal crash involving an Uber self-driving test vehicle in his opening majority statement:

“Ms. Elaine Herzberg was tragically struck and killed by an Uber test vehicle while crossing the street. Records show that the vehicle detected Ms. Herzberg’s presence 5.6 seconds before the crash, but failed to brake. It is imperative that manufacturers learn from this incident and prevent similar tragedies from happening again.”

More than 80 companies are testing automated vehicles on public roadways in the U.S. today. But only 16 of them have provided the safety self-assessments recommended by the National Transportation Safety Board (NTSB) and deemed voluntary by the National Highway Traffic Safety Administration (NHTSA).

At Wednesday’s hearing, the heads of both agencies faced questions from senators who were by turns excited by the promise of self-driving cars and concerned about issues ranging from driver and pedestrian safety to traffic.

Now, NTSB is urging NHTSA to put some conditions on developers who want to test, market and deploy semi- or fully-autonomous vehicles on U.S. roads.

“Whatever’s working right now is not working as well as we believe it should,” said Robert Sumwalt, chairman of the NTSB, which investigates crashes, including those involving autonomous vehicles, but does not have the authority to issue recalls.

Tesla Autopilot ‘cheats’

One of the most dramatic moments during the hearing on Wednesday came when Sen. Ed Markey, D-Mass., demanded to know what NHTSA was doing to compel Tesla to stop Autopilot “cheats.”

NHTSA has the authority to issue recalls, but has not done so where Tesla’s semi-autonomous systems are concerned.

Sen. Markey, citing an NBC Boston report, explained:

“Tesla drivers have identified a variety of tricks to make Autopilot believe they are focused on the road even if they are literally asleep at the wheel. Alarmingly, you can go to YouTube right now and learn about some of these tricks.”

In one easy Autopilot cheat, drivers affix a water bottle or an orange to the steering wheel, adding weight to it. The weight tricks the Autopilot system into believing the driver’s hands are on the wheel, so the car keeps steering automatically even if the driver falls asleep or climbs into the back seat.

Tesla cautions drivers to remain attentive at all times while using Autopilot. But because the system can be easily abused, Markey exclaimed: “That’s not safe! Somebody’s gonna die!”

Markey repeatedly pressed NHTSA acting chief James Owens, a witness at the hearing, for details about how his agency would compel Tesla to deliver a fix for Autopilot cheats. Owens said, “It is unfortunate when drivers misuse their vehicles and engage in unsafe behaviors.” He also promised to follow up with Tesla on this issue.

Sen. Markey snapped at him, “I would urge you to do that very quickly. Tesla should disable Autopilot until it finds the problem, until it fixes the problem, until it can assure consumers who don’t own that vehicle that they are safe on the roads or sidewalks from an accident occurring.”

The senator revealed that he sent a formal letter to Elon Musk’s electric car company this week, urging it to fix what he viewed as autonomous “design defects.” (That letter is now part of the committee’s record, and NHTSA is under pressure to deliver answers.)

Tesla did not immediately return a request for comment.

A comparison with Boeing

Sen. Tom Udall, D-N.M., suggested that NHTSA was moving too slowly to set and clarify the rules around self-driving vehicles, and warned of deadly consequences.

“While I appreciate the potential benefits of autonomous vehicles, I remain concerned that humans will be used as test dummies,” said Udall. “The self-certification approach did not work out well for the Boeing 737 MAX 8 and now Boeing is paying the price. We should heed that lesson when it comes to finding out the best way to deploy autonomous vehicles. The public does not want their safety watchdogs getting too cozy with industry. And industry should welcome strong safety regulation as being in their best long-term interest.”

Tesla CEO Elon Musk discusses self-driving features.

Visual China Group | Getty Images

NHTSA’s acting chief, Owens, told the committee that the agency is advocating for a common nomenclature around self-driving features so drivers and passengers won’t be confused about how automated vehicles do and don’t work. But overall, Owens indicated NHTSA is not ready to regulate:

He said, “If we establish standards too quickly, we run the risk of stymieing innovation. So we want to step back, we want to let the innovation occur, and the competition occur.”

He also said truly self-driving cars are “more than several years off.” While NHTSA has not done any official forecasts, he said:

“What we’re finding both from our own research and what we’re hearing from industry is that developing fully autonomous vehicles in a complex surface-driving environment is very hard. It’s very difficult. It’s more complicated and difficult than was anticipated several years ago. The technologies are continuing to be developed and improved, but they’re not going to be here next year or the year after.”



