Transportation

Even With Automation, Drivers Still Need To Remain Focused


Many vehicles today have technology designed to assist with driving, but drivers still need to stay involved at all times. Automated systems should be designed so that the driver and the technology share the driving, with built-in limits that prevent use when it isn’t safe.

Those are the highlights of a new analysis released on Thursday by the Insurance Institute for Highway Safety, a nonprofit financed by the insurance industry. The safety group examined Level 2 driver assistance systems and found that these and other partially automated systems need stronger safeguards to keep drivers alert and roads safe.

“Unfortunately, the more sophisticated and reliable automation becomes, the more difficult it is for drivers to stay focused on what the vehicle is doing,” David Harkey, president of the Insurance Institute, said in a statement.  “That’s why systems should be designed to keep drivers actively engaged.” 

The institute developed a series of guidelines for how automakers can better ensure that drivers remain focused even as vehicles do more of the work. 

Currently, there are five levels of automation, ranging from 0 (no automation) to 5 (fully self-driving). The most advanced available in vehicles today is Level 2 (like Tesla’s Autopilot and GM’s Super Cruise), which continuously controls acceleration, braking and steering to keep the vehicle traveling at a set speed in the center of its lane while maintaining a selected following distance from the vehicle ahead. 

“These systems are amazing feats of engineering,” Alexandra Mueller, a research scientist at the Insurance Institute and lead author of its recommendations, said in a statement. “But they all suffer from the same problem: They don’t account enough for the behavior of the human being behind the wheel.” Systems, she and her colleagues said, should ideally require human drivers to remain vigilant and ready to intervene when the system encounters a situation it cannot handle.


A survey conducted last year by the institute found that consumers often put too much trust in partial automation; they think Level 2 systems are practically self-driving. Previous research has shown that the more sophisticated and reliable automation becomes, the harder it is for a driver to remain vigilant, which increases the temptation to do other things, like text or check email.  

The analysis indicated that several high-profile fatal crashes demonstrate how dangerous such lapses can be and how essential it is that these systems be supervised by a fully attentive driver. In one fatal crash involving Tesla’s Autopilot system, for instance, a Model X failed to properly detect the lane markings at an exit ramp and crashed into a highway divider. The driver, who was killed, was playing a game on his cellphone at the time of the crash, the institute noted.

A National Transportation Safety Board (NTSB) investigation, concluded in February, found that Autopilot’s limitations, the driver’s overreliance on the technology and his own distraction led to the crash. The NTSB called for the development of standards for driving monitoring systems “to minimize driver disengagement, prevent automation complacency and account for foreseeable misuse of the automation.”

The Insurance Institute’s researchers specifically recommended against enabling automatic lane changing and overtaking systems. Currently, for example, Level 2 automation offered by BMW, Mercedes-Benz and Tesla can automatically change lanes when the driver triggers the function with the turn signal. In pre-mapped areas, Tesla’s system goes even further; its Navigate on Autopilot feature can change lanes and exit the freeway without any trigger from the driver, the institute noted:


“Even if these systems are capable of performing such maneuvers safely in most situations, drivers are more likely to lose track of what is happening on the road when their role in lane changing and overtaking is reduced to the flick of a lever. A false sense of security may cause drivers to initiate the automatic procedure without confirming that the lane next to them is empty, as some user manuals instruct.”

Systems should be designed to share control with the driver to prevent inattention, institute researchers said, but few currently embrace that philosophy. “Drivers feel more comfortable with systems that don’t fight their input, especially when navigating curves and making other challenging maneuvers,” Harkey added.

The institute’s researchers recommended multiple, more robust methods of monitoring whether the driver is paying attention and of re-engaging the driver when focus wanders. These include a driver-facing camera, measurements such as manual adjustments to the steering wheel, and checks of how quickly the driver responds to visual, audible and physical attention reminders, used separately and in combination. If the driver still fails to respond after a series of escalating alerts, the system should deploy the hazard lights and gradually slow the vehicle to a stop.
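The escalation sequence the institute describes could be sketched as a simple state machine. The sketch below is purely illustrative; the alert names, ordering and reset behavior are assumptions for the example, not details from the institute’s guidelines:

```python
from enum import Enum, auto

class Alert(Enum):
    """Escalating attention reminders (illustrative ordering)."""
    VISUAL = auto()     # e.g., a dashboard icon
    AUDIBLE = auto()    # e.g., a chime
    PHYSICAL = auto()   # e.g., seat or steering-wheel vibration
    FAILSAFE = auto()   # hazard lights on, vehicle slows to a stop

def next_alert(current: Alert, driver_responded: bool) -> Alert:
    """Escalate one step for each ignored alert; reset when the driver responds."""
    if driver_responded:
        return Alert.VISUAL
    order = list(Alert)
    idx = order.index(current)
    # Stay at the final failsafe stage once reached.
    return order[min(idx + 1, len(order) - 1)]
```

In this sketch, any response from the driver drops the system back to the lowest alert level, while continued inattention marches it toward the failsafe stop described above.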

No manufacturer currently incorporates all recommended measures, the institute noted. 

Some systems switch themselves off if the driver fails to respond to repeated alerts. If this occurred while a lane-centering system was engaged and the driver was incapacitated, for example, neither the driver nor the technology would be steering.

“Because these systems still aren’t capable of driving without human supervision, they have to help prevent the driver from falling out of the loop,” Mueller added.  




