Transportation

When Self-Driving Cars Get Embroiled In Massive Car Pileups


During the winter months, there is always a spate of massive car pileups that woefully take place and make rather noticeable and attention-getting headlines.

Recently, in Iowa, there was a forty-car pileup. This was especially eye-catching because two Iowa State Patrol cars were also enmeshed into the traffic cataclysm and ended up getting hit (thankfully, the officers were okay). In Japan, a mind-blowing pileup occurred that involved at least 134 vehicles, stranding about two hundred people, injuring seventeen, and miserably leading to one death.

Even though these somber events are often referred to as multi-car pileups, the more apt depiction is that they are “multiple vehicle” pileups. This alternative naming reflects the fact that there are usually trucks involved in these pileups, and thus the calamity strikes not just cars but can encompass other vehicles too.

It seems that all drivers are subject to finding themselves getting embroiled in a chain reaction series of vehicle collisions.

The situation usually entails at least one vehicle that goes amiss, and then others ram into each other in a cascading cacophony. For example, envision that a car veers into the path of a large truck; the truck attempts to avoid a collision and ends up in a jackknifed posture. Another car that is behind the truck makes a radical maneuver to avoid hitting the truck. This ploy then leads to a different car hitting the escaping car. Maybe they both then slide into the jackknifed truck.

The point is that the basis for the multiple vehicle calamity is that a series of cars and trucks find themselves getting into a dangerous dance of each trying to avoid the other. In a sense, the avoidance draws other vehicles into the morass. Everyone is doing something to avoid others, and yet they invariably ram into each other.

It is possible that some of those vehicles might luckily be unscathed and able to avoid striking anyone else. This seeming miracle can be short-lived since some other wayward vehicle might suddenly ram into them, as though the hand of fate has decided that all the vehicles are going to be damaged by the chain reaction. You’d have to really be relying upon your lucky rabbit’s foot to be in the midst of a multiple vehicle collision and truly avert getting dinged or banged-up by the experience (if you are that fortunate, it is probably wise to thereafter go and buy a lottery ticket and test that luck for the furtherance of good fortune).

Not all of the drivers are necessarily on the ball and seeking to avoid the other vehicles that are enmeshed in the quagmire of collisions.

It could be that the event unfolds so quickly that some of the drivers do not have sufficient time to react. They are merely carried into the collisions by the physics involved. Imagine swimming in the ocean and getting carried by a large wave. There might not be much that you can do to cope with the overwhelming forces entailed.

Some drivers do react and yet might try to respond in a manner that makes things worse. Ironically, if they had done nothing in terms of attempting to steer out of the abyss, they might have come out with less damage. Instead, they try frantically to get out of the impending doom and find themselves confounding other drivers accordingly. This makes for a much wider array of confusion and extends the number of vehicles caught up in the adversity.

Time is usually a key factor.

The collisions happen so fast that there is very little time to make profound mental calculations and ascertain the best course of driving action to take. Furthermore, all those other drivers are making mental calculations of their own, and what they might decide to do is somewhat unpredictable.

In that sense, probabilities are also crucial.

Assuming you have at least some available time to make a driving decision, you are no longer in a rather static traffic situation. When traffic is routinely taking place, you pretty much can anticipate what other drivers are going to do. There is a very high probability that the car next to you will continue forward and similarly it is highly unlikely they will opt to veer unexpectedly into your lane.

All bets are off when a multiple-vehicle pileup ensues.

The car driver to your left might think it is best to come into your lane. You might be thinking that the neighboring car ought to head into the emergency lane, but it seems this is not going to be the case. All of your normal assumptions about driver behaviors can get tossed into the air and become quite scrambled.

You’ve got some drivers that are trying to self-optimize and minimize their chances of getting hit. Some drivers are trying to minimize their chances of hitting other cars. And some drivers aren’t doing any mental contortions and are being drawn into the evolving event without any direct effort of avoidance.

This is somewhat like a chess game that has gone amok (for more on this, see my discussion at this link here).

You cannot assume that the other drivers are still playing the car driving game of chess by the proper rules. It has become every player for themselves in a do-or-die gambit. Worse still, some of the players are bad at this kind of chess game, namely one involving a multiple-vehicle collision scenario. Plus, there are those playing the game that are essentially asleep at the wheel, which doesn’t mean they are actually dozing, and only suggests they maybe aren’t paying attention or are stuck in a situation for which they are mentally freezing up and don’t know what to do.

I earlier pointed out that these pileups seem to happen predominantly during the winter months.

In theory, the advent of multiple vehicle collisions could happen at any time of the year. There are instances of these events happening during the summer months, especially when a lot of vehicles are on the roads and traveling at high speeds. Nonetheless, winter carries a particularly high chance of such multi-car catastrophes.

Why so?

The obvious aspect is that the weather provides the “perfect storm” by stoking the elements underlying a large-scale pileup.

Once the freeway has become icy, the ability to control a vehicle diminishes. Add snow flurries to the equation and you get reduced visibility and ergo can no longer see the road as clearly. The driving scene turns into a problematic scenario.

The matchstick that can ignite the pileup consists of driving at unsafe speeds for the driving conditions involved.

First, let’s conceive of the idealized setting that won’t lead to chain reaction results.

Imagine an icy freeway that is encountering snow squalls. All the traffic consists of vehicles driving at super-duper slow speeds, cautiously crawling along. Each driver is carefully monitoring the distance between them and the vehicle ahead of their car. At all times, the drivers in this scenario are aware of the impending danger of a multiple vehicle smasher and so are conscientiously driving with great aplomb and rapt attention.

If one of the vehicles starts sliding precariously, the closest vehicles to this prospective pileup starter can react proficiently and avoid hitting the igniter of potential disarray. Or maybe they do lightly strike the vehicle, but at a slow speed that entails almost no demonstrable damage or injuries. Meanwhile, the other vehicles nearby have all reacted quickly and come to a stop, preventing a cascading chain reaction.

Your normal everyday commute probably is similar to the aforementioned scenario.

There are somewhat singular or unitary car crashes that happen daily on freeways and highways (I mean when two cars collide), yet they do not turn into massive car pileups. This avoidance is typically due to other nearby drivers being able to react sufficiently to prevent cascading impacts. Sure, it could be that a lot of the cars have to hit their brakes suddenly and tires are squealing, but fortunately, this happens in a manner such that few slips-or-slides occur around and into the surrounding traffic.

It is like setting up a bunch of dominos.

When you place dominos stacked one after another, close to each other, you can readily get them all to fall down by simply starting the process at the start of the line. Touching the first domino and getting it to fall will strike the next one, which strikes the next one, and so on.

We’ve all done that before.

If there is a sufficient gap between one domino and the next, the falling domino cannot reach its neighbor, and the chain is essentially broken such that the next domino in the sequence does not fall. In a sense, that is what happens when drivers are paying attention and driving at speeds that are safe for the conditions of the roadway.
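To make the domino analogy a bit more concrete, here is a minimal sketch in Python (not a physics model, and all the numbers are merely illustrative assumptions): a cascade continues as long as each follower’s braking overshoots its following gap, and a single generous gap breaks the chain.

```python
# A minimal sketch of the domino analogy: a lead vehicle stops abruptly,
# and each follower collides if its following gap is smaller than the
# distance by which its braking overshoots. All numbers are illustrative.

def chain_reaction(gaps_m, overshoots_m):
    """Return the indices of followers that collide, given each follower's
    gap (meters) and braking overshoot (meters)."""
    collided = []
    for i, (gap, overshoot) in enumerate(zip(gaps_m, overshoots_m)):
        if overshoot > gap:
            collided.append(i)  # this "domino" falls into the one ahead
        else:
            break               # a big enough gap breaks the chain
    return collided

# Tight spacing: every follower overshoots its gap -> full pileup.
print(chain_reaction(gaps_m=[5, 5, 5], overshoots_m=[8, 8, 8]))   # [0, 1, 2]

# One driver left a generous gap: the cascade stops there.
print(chain_reaction(gaps_m=[5, 30, 5], overshoots_m=[8, 8, 8]))  # [0]
```

The simplification here is deliberate: real collisions shove struck vehicles forward, but the core point survives, namely that a single well-spaced “domino” halts the sequence.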

One aspect of multiple car pileups that you might not be aware of involves the possibility of human injuries and death. We can readily visualize the car damage, such as bent fenders and crumpled car doors. What doesn’t necessarily come to mind is the human cost too.

The insidious aspect is that you can get hit not just once, but potentially a multitude of times. Perhaps the car behind you rams into your car. That’s one blow. A car behind that other car rams into the car that just hit you and the forces carry once again into your car. On top of this, your airbag may have already deployed. This means that for the subsequent blows, your airbag is no longer especially useful since it is already deployed and now is partially deflated.

I don’t want to linger on these dreadful aspects and will merely also point out that it can be hard to extricate yourself from your mangled vehicle. There is also the chance that gasoline leaks will occur, and the spilled fuel could be ignited by nearby sparks or flames. Into this horrid convergence is the other nightmarish aspect that the ambulances and paramedics might not be able to rapidly reach the vehicles since the whole entanglement is somewhat impenetrable.

One shudders at the whole predicament.

A massive pileup is assuredly a beast of a different kind and one that you never want to get ensnared in.

Shifting gears, the future of cars involves the emergence of self-driving cars. Some have asserted that the advantage of self-driving cars is that they will never get stuck in a massive car pileup. The logic of this contention is that the AI driving systems will be astute enough to avoid any such mishap.

That’s quite a bold claim.

Here then is a vital question to consider: Will the advent of AI-based true self-driving cars ensure that there will never be any multiple vehicle pileups?

Let’s unpack the matter and see.

Understanding The Levels Of Self-Driving Cars

As a clarification, true self-driving cars are ones that the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.

These driverless vehicles are considered a Level 4 and Level 5 (see my explanation at this link here), while a car that requires a human driver to co-share the driving effort is usually considered at a Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).

There is not yet a true self-driving car at Level 5, and we don’t yet even know whether this will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend, see my coverage at this link here).

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different than driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points next made are generally applicable).

For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that’s been arising lately, namely that despite those human drivers who keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that a driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3.

Self-Driving Cars And Roadway Pileups

For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task.

All occupants will be passengers.

The AI is doing the driving.

Let’s start this discussion with an important point that might get some pundits into a real tizzy.

Self-driving cars can get embroiled in a massive car pileup.

Yikes, that seems sacrilegious and goes against the prevailing wisdom about the revered safety aspects of self-driving cars. It seems like an impossible possibility.

We can quickly dispense with the false belief that self-driving cars will never get stuck in a multi-car or multi-vehicle pileup.

Here’s how it can happen.

A self-driving car is driving along on a freeway. A human-driven car up ahead comes to a sudden stop, perhaps trying to avoid ramming into a truck that has just jackknifed. We will give the benefit of the doubt to the self-driving car that it was maintaining a proper stopping distance and thus can come to a halt before running into the human-driven car up ahead (I’ll be taking back this assumption in a moment, but let’s go with it for the time being, for sake of discussion).

Wonderful, you say, the AI driving system has saved the day and not hit the car that has come to a surprising stop.

Whoa, hold your horses.

The human-driven car behind the self-driving car is caught off-guard by the sudden stop of the self-driving car. It then rams into the self-driving car. This shoves the self-driving car forward, and the self-driving car now rams into the car that stopped when trying to avoid the jackknifed truck.

Meanwhile, other cars and trucks behind this mess are all ramming into each other, doing so in the usual and unfortunate traffic dominos fashion.

Could this scenario happen in the real world?

Of course.

There isn’t anything about this scenario that is somehow rigged against the self-driving car. We are assuming that the self-driving car was driving in a perfectly legal manner, and it was dutifully trying to keep its distance from the traffic ahead. The self-driving car came to a prompt stop. The sensors detected that the car ahead was halting and the AI driving system did a tremendous job of bringing the self-driving car to a halt.

Yet, despite all this, the self-driving car was nonetheless enmeshed into the multi-car pileup.
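For the curious, the scenario can be roughly quantified with standard stopping-distance kinematics, where total stopping distance is v·t_react + v²/(2a). The speeds, gaps, reaction times, and deceleration below are illustrative assumptions, not data from any actual self-driving car:

```python
# A back-of-the-envelope sketch of the pileup scenario: the self-driving
# car reacts almost instantly and stops within its gap, while the human
# driver behind it reacts slowly and overshoots a shorter gap.

def stopping_distance_m(speed_mps, reaction_s, decel_mps2):
    """Distance covered during reaction time plus braking distance."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

v = 27.0      # roughly 60 mph, in meters per second
decel = 7.0   # hard braking on dry pavement, m/s^2 (assumed)

av_gap = 60.0
av_needed = stopping_distance_m(v, reaction_s=0.2, decel_mps2=decel)
print(f"AV needs {av_needed:.0f} m, has {av_gap:.0f} m -> stops in time")

human_gap = 40.0
human_needed = stopping_distance_m(v, reaction_s=1.5, decel_mps2=decel)
print(f"Human needs {human_needed:.0f} m, has {human_gap:.0f} m -> impact")
```

Even with these friendly assumptions, the trailing human driver needs roughly 93 meters to stop but has only 40, which is precisely how a flawlessly braking self-driving car still gets rammed from behind.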

I assure you, this can and will happen once we have a prevalence of self-driving cars on our roadways. Right now, there are so few self-driving cars that the odds of getting stuck in this kind of situation are rather low. Envision that once there are thousands upon thousands of self-driving cars, and they are roaming all the time, likely 24×7, the chances of finding them perchance trapped in this circumstance are a lot higher than they are today (see my discussion about roaming self-driving cars at this link here).

Okay, so I hope this dispels the myth of the invincible self-driving car.

I realize that some pundits will be instantly up in arms and argue that this scenario is “unfair” because it presumes that there are human-driven cars in the traffic setting. These pundits are envisioning a world of only self-driving cars, such that all human-driven cars are banned from the roadways.

That alone is quite a dreamy vision. First of all, there are 250 million conventional cars in the United States alone, and those cars are not going to overnight be junked merely to be instantaneously replaced by self-driving cars. The reality is that we are going to have many years of a mixture of both human-driven cars and self-driving cars, likely for decades to come.

Some have suggested that we ought to have roads that are dedicated solely to self-driving cars, thus avoiding having to come in contact with human-driven cars, but this is also a quite questionable proposition (it might be done in very narrow contexts, but this doesn’t seem economically feasible on any large scale).

All in all, you have to assume realistically that self-driving cars and human-driven cars will be on our roadways together, at times being best buddies and at other times finding themselves at odds with each other.

Now that we’ve covered the “proof of existence” that there will be self-driving cars that get embroiled in massive car pile-ups, we can turn our attention to other related facets.

One point worth making as to why self-driving cars will rarely be involved in such pileups is that the self-driving cars might not even be on the roads at the time and place wherein these pileups happen.

Here’s the story.

For the Level 4 self-driving cars, the automaker and self-driving car firms are supposed to define the Operational Design Domain (ODD) for their cars.

The ODD refers to the conditions under which the self-driving car will operate or be driven by the AI driving system. For example, an ODD might indicate that automaker XYZ has decided that their self-driving cars can only operate when the weather is sunny or maybe under light rain, but that once there is snow or heavy rain, the AI driving system will refuse to drive the car. This is for safety considerations in that the developers or the fleet operator have predetermined that the sensors and the AI driving system are not proficient in those kinds of adverse conditions.

Indubitably, this makes good sense.

The ramification is that when you hear or see these news stories of massive pileups that happen in those icy and snowstorm conditions, the chances are that a Level 4 self-driving car won’t be there, simply due to the aspect that the conditions were outside the defined ODD and the self-driving car “refused” to drive in such conditions (when I say that it refused, this is not a mysterious form of AI sentience, instead, the AI is connected to weather forecasts and has been programmed to not get underway in circumstances outside its ODD, or that if it is underway then it pulls over to a safe spot and waits for the conditions to become drivable, etc.).
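For illustration only, here is a hedged sketch of how such a programmed “refusal” might look; the field names and thresholds are hypothetical and not drawn from any actual automaker’s ODD definition:

```python
# A hypothetical sketch of an ODD gate: the vehicle gets underway only
# if the weather forecast falls inside its declared operating envelope.

from dataclasses import dataclass

@dataclass(frozen=True)
class OperationalDesignDomain:
    allowed_conditions: frozenset  # weather states the AI may drive in
    max_precip_mm_per_hr: float    # heaviest precipitation tolerated

def may_get_underway(odd, forecast_condition, forecast_precip_mm_per_hr):
    """Return True only if the forecast is inside the defined ODD."""
    return (forecast_condition in odd.allowed_conditions
            and forecast_precip_mm_per_hr <= odd.max_precip_mm_per_hr)

odd = OperationalDesignDomain(
    allowed_conditions=frozenset({"sunny", "cloudy", "light_rain"}),
    max_precip_mm_per_hr=2.5,
)

print(may_get_underway(odd, "light_rain", 1.0))  # True  -> drives
print(may_get_underway(odd, "snow", 0.5))        # False -> stays parked
```

The point of the sketch is simply that the “refusal” is a plain programmed check against predetermined conditions, not any mysterious form of AI judgment.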

In short, the self-driving car never got on the road to start with, and therefore it didn’t get ensnared into the pileup.

You might liken this to a human driver that looks outside the window of their house, sees that it is snowing, and proclaims they are not going to go driving since the roadway conditions are going to be rotten.

If a domino is not in the set of dominos that are stacked up to fall, it won’t be part of the cascading series.

Conclusion

You might be tempted to think that this showcases that those self-driving cars are going to be more astutely run since they will not be on the roads when the chances of pileups are highest.

Yes, and no.

When those trucks and cars get stuck in a pileup, is your first thought that all of those drivers, each and everyone, was completely foolhardy and out of their minds to be on the roadways?

It seems doubtful that this is the case. There were drivers in that setting who weighed the inherent dangers of being on the roads against their need to be in their cars. Perhaps some were heading to work and had to choose between driving in those conditions and losing their job if they didn’t.

Many of those drivers likely had compelling reasons to take a chance and be on the roads.

The point is that if the car itself refuses to drive (again, based on programmatic facets), this does not particularly solve the problem. This would imply that those humans that believed they had a bona fide basis for getting on the roadways would no longer be able to do so, assuming that they were fully reliant on only using self-driving cars.

This is quite a conundrum.

Who shall decide that it is safe to be on the roads?

Is this a question to be ascertained by the individual human driver, or by some overarching system or process that decides this matter? You could argue that the government routinely will close roads or urge people to not drive, and thus a means to ensure this decree would be to prevent cars from getting on the roads at all, rather than relying on humans to individually decide to abide or not abide by such cautionary aspects.

A counterargument is that if this kind of driving decision is going to be made by a collective, you will have human drivers that are going to categorically clamor that they will never give up their driving. They will vehemently say that the act of driving is a decision for the individual. You will take away their driving only when their cold dead hands are pried from the steering wheel.

Anyway, returning to the multi-vehicle pileups overall, one intriguing aspect of having self-driving cars enmeshed in pileups is that they will be handy “eyewitnesses” due to their sensors and sensor-collected data. Oftentimes, it is very difficult to reconstruct what occurred in a massive pileup and figure out the sequence that led to the chaos. By inspecting the data from the self-driving cars, it might be a lot easier to discern how things went awry.
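As a simplified illustration of the “eyewitness” idea, timestamped impact events logged by several self-driving cars could be merged into a single timeline; the event format here is purely an assumption for illustration:

```python
# A simplified sketch of reconstructing a pileup's sequence of events by
# merging each car's (timestamp, description) log entries into one timeline.

def reconstruct_sequence(per_car_logs):
    """Merge every car's events and order them by timestamp."""
    events = [event for log in per_car_logs for event in log]
    return sorted(events)  # tuples sort by their first element (the time)

car_a = [(10.2, "truck jackknifes ahead"), (11.0, "struck from behind")]
car_b = [(10.8, "hard braking"), (11.4, "slides into lane 2")]

for t, what in reconstruct_sequence([car_a, car_b]):
    print(f"t={t}s  {what}")
```

Real-world reconstruction would involve fused camera, radar, and LIDAR data rather than tidy text labels, but the underlying detective work is this kind of cross-vehicle timeline merging.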

Whether this detective-like facility is endearing would likely depend upon whether you were a human driver that wanted to prove your innocence or were trying to avoid getting nabbed as the troublemaker that knocked down the first domino.

AI as a tattletale will be good for some, and a form of angst for others.

Please watch out for those pileups and I hope none of you get entangled in one.


