Transportation

What You Should Do If You See An AI Self-Driving Car Behaving Badly


Humans can be troubling.

How so?

Consider your daily commute.

The odds of coming across a driver that is doing something untoward are seemingly quite high. You might be minding your own business, calmly driving along, and out of the blue, another driver does something quite wacky.

On a daily basis, most of us will observe a car that swerves across all lanes of traffic to make a panicky turn that the driver had failed to prepare for. We will likely see drivers that cut off other cars and force those poor souls into having to hit their brakes or make a sudden maneuver to avoid rear-ending the imposing vehicle. There is a solid chance that you’ll come across a driver that crazily rolls through a stop sign and confounds both pedestrians and nearby cars that are underway.

The variety and volume of wayward drivers are astronomical.

Yet we typically take most of the foul driving in stride and just assume that it is the way of the world.

You would love to report the driver to someone and make sure the vile driver gets sternly rebuked, possibly even have their driver’s license revoked. Get those miscreants off the road, your mind screams. But the reality is that the fleeting act of those careless drivers is not readily converted into them getting nailed for their unruly driving. Unless a police car happens to come along and witness the outlandish and likely illegal driving, the whole shebang boils down to your word against theirs (and they would seem unlikely to freely admit to their driving transgression).

When a car gets into a crash or collision, this can be an occasion whereby a driver that is doing rotten driving can potentially get noticed. Of course, the odds are they will insist tooth and nail that it was purely an accident, as though nothing could have been done to prevent the pileup, or they are bound to claim they weren’t at fault and somebody else was. This all translates into unfortunately low odds that, even after getting into a car wreck, they will get summarily dinged for being a lousy driver.

There is a bit of difference between someone that on a one-time basis makes a bad driving choice versus someone that drives terribly all of the time. The thing is, you might not have any means of knowing which is which. A driver that cuts you off might be the type of person that does this to everyone and always, or it could be a driver that made an honest mistake and regrets profusely the intrusion.

In California, as in many other states, the Department of Motor Vehicles (DMV) has various complex rules about designating a driver as negligent with regard to the driving task or potentially incompetent at driving. A lot of hoops have to be jumped through to get somebody onto that list. If you perchance come across a driver that you believe is negligent or incompetent, you pretty much have an extremely slim chance of adding their name to that bad-driver compilation (it will take a massively persistent effort with tons of unshakable evidence).

Besides the everyday response of having to grit your teeth and put up with a badly behaving driver, you could consider taking more overt action. For example, you could potentially call 911 to report the driver. This would seem prudent if you genuinely believe the driver is a roadway hazard and a ticking time bomb that might soon crash into other innocent drivers and pedestrians.

Making the decision to contact 911 is not an easy one.

You don’t want to waste the time of the police. You don’t want to be that type of person that makes superfluous calls to 911. You don’t want to send the police on a potential wild goose chase. There are a ton of bona fide reasons to not call 911. Presumably, you would only use the 911 option as a last resort in the sense that you fully believe there is an imminent and possibly deadly situation brewing.

It is tough to be a Good Samaritan.

Another possibility involves trying to warn other traffic about the disruptive driver.

You might honk your horn or flash your headlights, doing so to alert drivers nearby. This can backfire on you, simply due to the somewhat obvious aspect that you are drawing attention to yourself, rather than to the car that you believe is being improperly driven. Other drivers will think you have a screw loose. This might cause them to overlook the dangerous driver. You have somewhat shot yourself in the foot, leading to the bad driver continuing along, and meanwhile, there are a bunch of wary drivers probably eyeing you like a hawk.

You could try to do something about the troublesome driver.

Perhaps position your vehicle in front of the miscreant to get them to slow down or stop. I can emphatically say this: decidedly do not take those kinds of actions. Your desire to rectify the situation could readily worsen things considerably. The other driver might react in a fit of road rage. Drivers coming upon the two of you won’t know what led to the altercation. And some of those other cars are likely going to end up in a conflagration with the two of you during the potentially deadly tangling.

Most authorities, including the police, would sternly say that you should not try to engage in a dogfight between cars. You would be better off staying back from the bad driver and reporting the situation to the police. Again, as mentioned earlier, you don’t do this at the drop of a hat, reserving such direct reporting only for when things are relatively dire.

One aspect that probably goes through your head is the wonderment of what is going through the noggin of the other driver. Why did that person make that nutty driving move? Are they completely witless? Was there some reason to drive so badly?

We all make use of something referred to as the “theory of mind” when we mull over what another driver is doing. The notion is that you try to imagine what is in that person’s mind. It could be that the person has a perfectly rational basis for making a bad move. On the other hand, perhaps the person is intoxicated. Maybe they are angry because of something that happened at work that day. We try to put ourselves into the shoes of the other driver and figure out what is going on in their mind.

Suppose you are behind a driver that surprisingly hits their brakes and yet you don’t see any discernible reason for doing so. A few moments later, you see a dog that was hidden from view and had been in the middle of the street, which the driver came to a screeching halt to avoid. You are likely to feel somewhat sheepish that your first thought was that the person was a lunatic. That being said, there is some concern that they could have caused a car crash and perhaps might have handled the predicament with more aplomb.

Anyway, this takes us back to the earlier point that you might not know from a single instance whether the driver is altogether bad or whether it was an ad hoc one-off situation. Sometimes you need to observe the other car over a sizable distance and detect numerous clues that accumulate into a stronger case that the driver is a maniac at the wheel.

Shifting gears, consider that the future of cars consists of self-driving cars. These AI-based true self-driving cars are driven by an AI driving system. No human is at the wheel. For my extensive coverage of self-driving cars, see the link here.

There are a lot of myths and misconceptions about self-driving cars.

Some pundits seem to declare that the beauty of self-driving cars is that they will never ever do anything wrong while on the roadway. They will be perfect drivers. In addition to strictly driving in a legal fashion, you will apparently never need to worry about a self-driving car that perhaps cuts you off, or stops in the middle of the road, or does anything else that a wayward human driver might do.

Those that spew out such nonsense are living in a dream world. The kind of perfect driving that they envision is either akin to a Utopian society that I assure you won’t become reality, or they are perhaps conflating those putt-putt cars at the theme parks with driving in the real world.

My point is that self-driving cars will indeed drive badly, from time to time, and though it hopefully will be a rarity, it is certainly not going to be a big fat zero. Anyone that throws around the word “never” is taking an outsized risk of being readily proven wrong by even the occurrence of one such instance. In that way, they are absolutely out to lunch since there will be at least one (and a lot more).

This is especially the circumstance right now as self-driving cars are being tried out on our public roadways while still being devised and readied for open-ended use. You might have heard or read the brouhaha the other day about a self-driving car that got mired in a construction zone and took driving actions that could be described as undesirable.

No one was hurt, and you should be mindful that this was a case assuredly not worthy of outsized headlines and hand-wringing. Save that for the problems associated with cars that aren’t at all self-driving and yet are being portrayed outrageously as though they are.

I’d like to forgo discussing the false self-driving cars or ones that are clearly not self-driving and focus herein solely on the AI-based true self-driving cars.

The intriguing question is this: Will the advent of AI-based true self-driving cars entail having the AI driving system at times drive badly, and if so, what should you do about it as a nearby driver in a human-driven vehicle?

Before jumping into the details, I’d like to clarify what is meant when referring to true self-driving cars.

Understanding The Levels Of Self-Driving Cars

As a clarification, true self-driving cars are ones in which the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.

These driverless vehicles are considered Level 4 and Level 5 (see my explanation at this link here), while a car that requires a human driver to co-share the driving effort is usually considered at Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).

There is not yet a true self-driving car at Level 5. Indeed, we don’t yet know whether this will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend, see my coverage at this link here).

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different than driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points next made are generally applicable).

For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that’s been arising lately: despite those human drivers that keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that a driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3 car.

Self-Driving Cars And Behaving Badly

For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task.

All occupants will be passengers.

The AI is doing the driving.

One aspect to immediately discuss entails the fact that the AI involved in today’s AI driving systems is not sentient. In other words, the AI is altogether a collective of computer-based programming and algorithms, and most assuredly not able to reason in the same manner that humans can.

Why this added emphasis about the AI not being sentient?

Because I want to underscore that when discussing the role of the AI driving system, I am not ascribing human qualities to the AI. Please be aware that there is an ongoing and dangerous tendency these days to anthropomorphize AI. In essence, people are assigning human-like sentience to today’s AI, despite the undeniable and inarguable fact that no such AI exists as yet.

With that clarification, you can envision that the AI driving system won’t natively somehow “know” about the facets of driving. Driving and all that it entails will need to be programmed as part of the hardware and software of the self-driving car.

Let’s dive into the myriad of aspects that come into play on this topic.

Imagine that you are driving along and perchance a self-driving car is ahead of you. The autonomous vehicle seems to be abiding by the lawful rules of driving. There is nothing out of the ordinary taking place, other than this is a self-driving vehicle and there is no human driver at the wheel.

As an aside, I realize it might seem unusual to be near a self-driving car while on the public roadways, but this is relatively common in select locales. Driving around the Silicon Valley and San Francisco area on a daily basis, you would perhaps be surprised to know that you are bound to regularly encounter a self-driving car in traffic. For some human drivers in that region, the appearance of a self-driving car has become an altogether ho-hum matter and accepted as the norm. You can expect that this will inevitably be the case throughout the country.

Back to our saga.

The self-driving car ahead of you is coming up to a part of town that has some quite wicked streets that are replete with potholes, wide cracks, faded street markings, etc. Turns out that as you get closer to those bad spots, you notice a road crew is doing some roadway repair work. Thank goodness, finally, the city is fixing this mess.

There is an electronic display board flashing a warning that you are soon entering into a construction zone. Red cones have been placed to guide traffic around the workers. The proper path through this zone is somewhat confusing, but human drivers are determined to proceed and are maneuvering and jockeying for position to navigate the morass.

The self-driving car seemed to at first figure out where to go. It moved over into the narrowed open lane. Unfortunately, the cones have been shoved around by other passing vehicles, and the AI driving system appears to now be confused. It starts to shift out of the open lane, then stops, then shifts back into the lane, then pops out of it. Finally, the self-driving car comes to a standstill.

The self-driving car has become a blockage and is straddling the only open lane. Human-driven cars are sitting behind the self-driving car and waiting for the AI driving system to figure out what to do. Patience is thin, anger and angst are brewing. Some of the waiting drivers lean on their horns. Other cars begin to veer directly into the construction area to try and make their way around the halted self-driving car.

You are also pinned behind the self-driving car. There is no ready means to proceed forward. You would need to either go head-to-head into oncoming traffic or try to sneak into the construction area and endanger the road crew.

Is this an instance of a self-driving car behaving badly?

You could certainly say so.

The AI driving system has abruptly stopped the vehicle, doing so in a precarious spot that entails an active roadway with numerous cars and lots of ongoing traffic. In theory, self-driving cars are not supposed to do this. The overall general requirement or preference is that if a self-driving car is going to halt, it does so in a manner that offers a so-called Minimal Risk Condition (MRC). See my detailed discussion about MRC at this link here.

Some might try to argue that the self-driving car is not threatening anyone per se. It did not try to run anyone down. It did not nearly sideswipe another car or rear-end another vehicle. As such, the case could be made that though the act of coming to a dead stop in an active lane is undesirable, it does not rise to a level of notable concern.

A retort would be that the self-driving car has committed an illegal driving action. This breaks the touted myth that self-driving cars will never drive illegally. Furthermore, by blocking traffic, the self-driving car is clearly in the wrong and impeding the flow of traffic. Worse still, human drivers are now going out of their way to circumvent the stopped self-driving car, and are likely to get into dangerous postures as they do so (well, one realizes that this could be construed as the fault of those human drivers, though there is fault to be laid at the feet of the self-driving car too).

What should you do, if anything, about the self-driving car that has unexpectedly and wrongly halted like this?

Suppose a human driver had done the same thing.

If so, you would probably be yelling at the driver to get the heck out of the way. Unfortunately, for most of today’s self-driving cars, there is no means to proffer external instructions or advice to the AI driving system (see my column for how this might take place with more advanced AI driving systems).

You would also likely be thinking about what the driver is thinking about. This goes back to my mentioning a “theory of mind” whereby you try to guess what other drivers are doing via what is going on in their noggin. For self-driving cars, you are unlikely to have any means of reasonably guessing at what the AI driving system is undertaking. Your best guess is that perhaps like any computer that sometimes gets overloaded and freezes up, maybe that’s what has happened with the self-driving car (a plethora of possibilities exist).

Another aspect would be to consider contacting the company that makes or operates the self-driving car. Perhaps the fleet operator doesn’t realize the predicament of the self-driving car. In theory, they should, and the AI driving system should have sent out an alert to the fleet operator, though this might or might not have been included in the system capabilities, plus there is a chance that the vehicle is in a place that has little or no viable electronic communications coverage.

Even if the self-driving car did send out an alert, you would have no means of knowing that it had done so. In other words, perhaps a team has been dispatched to come and aid the self-driving car toward getting back underway. That team might be many minutes away and therefore you would be stuck there and not aware that help is coming. All that you would see is that the self-driving car is unmoving and blocking traffic, dangerously so.

One would hope that at least the self-driving car would invoke its emergency flashers as a signal to others that it is halted. This might happen, or it might not happen. There is a chance that the AI driving system hasn’t derived that the situation is deserving of using the emergency flashers. Another possibility is that the emergency flashers can only be activated by a remote agent of the fleet. And so on.

On a related note, the odds of your being able to contact someone about the self-driving car are also confounded because few of the self-driving cars are outwardly well-marked with appropriate contact info. I’ve repeatedly emphasized that self-driving cars should have those bumper stickers or similar markings that state how to call, text, or make contact with a fleet operator or whoever can cope with that self-driving car when it has gone afoul (see my discussion at this link here).

Meanwhile, for those crazy daredevils that often try to take matters into their own hands, someone may try to come up to the rear bumper and push the self-driving car out of the way. Bad idea! As earlier stated about messing around with human-driven cars, you should not be messing around with self-driving cars either.

Okay, after considering all those options, you could potentially call 911. Per the prior discussion, the calling of 911 is supposed to occur only in the direst of traffic settings. Does this constitute such an instance? Given the potential for either the nearby roadway crew getting hit or for human-driven cars to ram into each other, a solid case could be made that this warrants a 911 call.

The rough thing is that the self-driving car might suddenly start moving again before any 911 emergency services arrive. It could be that a remote agent has communicated with the AI driving system and indicated where the AI should drive the car to. By the time the emergency services get there, the self-driving car could be long gone.

Conclusion

The focus in this brief exploration was a self-driving car that got itself stuck in a dire position, doing so of its own volition.

This reflects that the AI driving system is inadequate and has not been sufficiently programmed to handle some rather common situations that are nonetheless framed as edge or corner cases by some AI developers (there is a great deal of debate about what constitutes a rare or oddball driving scene versus what should be construed as everyday and expected).

The key here is that you cannot try to blame the AI as though it is the AI per se, i.e., the AI being sentient. It is not sentient. You have to look at the automaker or self-driving tech firm that has devised the AI driving system and hold them responsible for what their technology and vehicle are doing. Do not shrug your shoulders and wave your hands that this is somehow just the vagaries of AI.

Keep in mind too that a slew of other situations can readily reveal the foibles or weaknesses of the AI driving system (at least with respect to that particular version, for that specific make and model, and should not be generalized to all self-driving cars). The saga provided is a somewhat tame or tepid instance.

Worse cases will undoubtedly arise.

You also need to take into account that there will be malfunctions of an automotive nature, beyond the AI driving system itself. A tire can suffer a blowout on a self-driving car. The under-the-hood mechanical elements of a self-driving car can bust or break.

Some seem to think that a self-driving car will not ever be subject to the same physical and mechanical breakdowns as human-driven cars, but this is a ridiculous and foolish assumption. Right now, the few self-driving cars on the roadways are being nightly pampered with tiptop maintenance service and are ergo unlikely to have any mechanical issues while underway. This might not be the case once there are thousands upon thousands of self-driving cars on our roadways.

That about wraps things up.

As a recap, human drivers are known to behave badly. We don’t yet seem to widely realize that self-driving cars can behave badly too, yet this can and will happen, and we need to be prepared to respond appropriately.

Always be prepared is a handy motto.



