Self-driving cars are dangerous in the wrong hands


Approaching a red light the other day, I half-expected my car to stop by itself. A split second later, I realised it would not and put my foot on the brake. That is the problem of driving a car that has half a mind.

We have long been promised that fully self-driving cars are about to take to our roads, but the technology remains out of reach. “It’s an extraordinary grind . . . a bigger challenge than launching a rocket and putting it in orbit around the Earth,” John Krafcik, chief executive of Google’s sister company Waymo, told the Financial Times this week.

Instead, there are vehicles such as my new Volvo, equipped with “pilot assist” — software that keeps it cruising at safe speeds and steers it on highways. When your car can slow down and halt behind the vehicle ahead, but ignores a red light on an empty road, it gets confusing.

Semi-autonomy is enjoyable: it can be a relief to let the machine bear more of the responsibility for highway driving. But this was intended to be a first step along the path to full autonomy — the future imagined by Volvo in a video is of a woman reading a book, then nodding off safely, in a speeding vehicle.

That future is not arriving any time soon. Uber last month abandoned its efforts to build a fleet of self-driving robotaxis, folding its operations into the Amazon-backed rival Aurora. In 2016, Travis Kalanick, Uber’s co-founder, described autonomy as “basically existential for us”, but the company no longer thinks so.

Battery-powered electric vehicles and self-driving cars were once treated as joint phenomena — Elon Musk put “Autopilot” software in every Tesla and made grand claims about it. But their paths are diverging as electric car sales grow rapidly, while autonomy stalls.

This leaves drivers in an ambiguous and potentially dangerous position, with cars that often appear to be capable of driving themselves, but still need to be supervised closely. One study dubbed this “co-driving”. Out on the road, the discipline is quite different to the traditional kind.

The good part first, for those who have not experienced it. Driving on an uncrowded highway in my car is pleasing: it maintains the right speed, slowing and accelerating smoothly in sympathy with cars in front. It steers around gentle bends, tracking its progress with a camera and radar sensor.

This has the potential to make our roads safer. Even if you co-drive correctly, with hands on the wheel and eyes on the road, it is more restful than driving yourself. That reduces fatigue, one of the main causes of highway accidents. The car also has a useful ability to avoid collisions in an emergency.

The bad part is ambiguity over who or what is in charge at any time. A series of crashes made carmakers more cautious about giving customers false confidence about semi-autonomy. But drivers still have to remain alert and realise when to take back control.

My vehicle signals when it needs help with dashboard alerts and a subtle shake of the steering wheel. After a while, one gets accustomed to the prompts and better at predicting when to override the software, which is easy enough. But it is a new skill, and not one that learners are yet taught.

It also requires the willingness to learn. One danger with semi-autonomous driving is risk compensation: the phenomenon of people taking greater risks when they feel safer. Safety innovations such as Volvo’s invention of the three-point seat belt in 1959 have reduced driving fatalities (the US death rate per mile driven in 2019 was about a third of its 1975 level). But the roads remain perilous, with 36,000 dying in US crashes that year.

It is easy to observe drivers bending the rules in semi-autonomous cars — there are YouTube videos of men zooming along in Volvos with hands only sometimes on the wheel. The “safety driver” in an Uber autonomous test vehicle that killed a pedestrian in 2018 was later found to have been streaming a television show on her phone.

The danger is getting stuck in the middle, as more cars are equipped with software that must be overseen by fallible humans, while the promise of full autonomy recedes. Volvo itself hopes to carry on advancing — it plans to launch new cars in 2022 capable of driving themselves on some highways without constant monitoring.

The company also wants to equip its cars to intervene when the driver makes a mistake, rather than the other way round. “We really think this is the next big thing in safety,” says Ödgärd Andersson, chief executive of Volvo’s self-driving software arm, Zenseact. I hope so too, but it requires a leap in the technology.

That has been promised before, so a lot depends on companies such as Waymo and Volvo making it happen. For now, I will keep co-driving with due care and attention.

john.gapper@ft.com


