
Self-Driving Cars Can Also Self-Design A Whole New Traffic Code


There has been much debate about how to regulate the safety and initial operation of self-driving cars, and even how to tell how safe they are. Our current “rules of the road” govern safety and traffic flow. They have been built by observing all the ways in which human drivers can’t be trusted to be safe and cooperative on the road, passing laws forbidding those behaviors, and sending out police to catch and punish offenders.

There are a billion drivers, each a different entity, so the use of the law makes sense. But there will never be more than a handful of robocar driving systems in any given area, and probably not more than a hundred or so worldwide. Unlike the human drivers, it will be possible to get representatives from each robocar system in a town or nation into one room at the same time. There, they can discuss, among themselves and with regulators, what the right rules are. Once the rules are agreed upon, they can also be enforced directly with those entities.

Most of the rules of the road break down into these two goals:

  1. Be safe
  2. Share the road (i.e., do not unfairly impede others)

There are a variety of local regulations for specific streets, such as declaring one-way streets and parking zones, though usually these are created in order to support the two goals in special ways on certain streets.

This creates the potential for a vastly simpler vehicle code. One could declare such principles, and then have a sort of court or ruling body that can determine if a particular practice violates them. Once a ruling is given, all would implement it. If anybody didn’t, it would quickly become apparent, and enforcement could be applied directly with the developer as needed.

Game theory teaches a lesson

There is the potential for even more, though, something which can happen largely in the absence of regulators. Today, we must expect that any given individual driving the roads will be selfish. We can even expect that any given brand of robocar might drive selfishly as well. But for a group, there is a way to stop that, and it comes to us from the field of game theory, its most famous problem, “The Prisoner’s Dilemma,” and its top solution, known as “tit for tat.”

We can effectively strengthen the second principle to include a new concept: “Give, and you will receive more in return.”

If we suppose that we have the developers of all the robocars in an area in the room, they can discuss how they can cooperate. In the Prisoner’s Dilemma, the problem is that while cooperation is a win for both sides, if one side “defects” they get a bigger win in that particular circumstance, which makes defecting “smart.” If we take turns on the road all the time, everybody wins, but if one person cuts everybody off, they “win” but everybody else loses more. Everybody is better off if you can all agree to cooperate.

Researchers discovered that when you have multiple encounters with the chance to cooperate or not, the overall winning strategy is one named “tit for tat.” It means you cooperate by default, and presume others will, but as soon as somebody doesn’t cooperate, you remember that, and you (and those allied with you) don’t cooperate with the defector again, at least until they learn the lesson.
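To make the dynamic concrete, here is a minimal sketch in Python of an iterated Prisoner’s Dilemma in which tit for tat faces a player that always defects. The payoff numbers are the classic textbook values, chosen for illustration; nothing here comes from a real robocar system.

```python
# Iterated Prisoner's Dilemma: tit for tat vs. always-defect.
# Payoffs are the standard illustrative values: mutual cooperation (3,3)
# beats mutual defection (1,1), but a lone defector scores 5 once.
PAYOFF = {  # (my_move, their_move) -> my score
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first; afterwards, mirror the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=20):
    seen_by_a, seen_by_b = [], []  # each side's record of the other's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(seen_by_a)
        move_b = strategy_b(seen_by_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        seen_by_a.append(move_b)
        seen_by_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # (19, 24): one cheap win, then stalemate
print(play(tit_for_tat, tit_for_tat))    # (60, 60): steady cooperation pays best
```

Defecting wins exactly one round; after that, the defector is locked into the low mutual-defection payoff, which is the whole point of the strategy.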

To do this, you examine the possible solutions to any problem of how to share the road and find the one that is the best win for everybody. Everybody implements it, but if, while driving, a car notices another car that won’t cooperate, it can note what type of car that is and share that record. After that, nobody in the “club” will cooperate any more with the defecting type of car until it cleans up its act. That means that if Tesla, Cruise, Zoox, Waymo and EvilCar are all driving a city, and an EvilCar cuts off a Cruise in violation of the cooperation agreement, then not just all Cruises but all the other cars will no longer play nice with EvilCars. That one brief victory for that one EvilCar would be followed by a permanent nightmare of an unfriendly road for all EvilCars, which means that EvilCar would be crazy to do it and never would.
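One hedged sketch of how such a shared blacklist might work, using the brand names from the example above; the registry design and its one-strike threshold are assumptions for illustration, not any fleet’s actual system:

```python
# Sketch of a shared "defector registry" for the cooperation club.
# Member fleets report observed violations keyed by the offending brand,
# and every member consults the registry before extending courtesy.
from collections import Counter

class DefectorRegistry:
    def __init__(self, strike_threshold=1):
        self.strikes = Counter()               # brand -> reported violations
        self.strike_threshold = strike_threshold

    def report(self, brand, incident):
        """A member fleet reports a non-cooperative act by another brand."""
        self.strikes[brand] += 1

    def should_cooperate_with(self, brand):
        """Checked before yielding, making gaps, and other courtesies."""
        return self.strikes[brand] < self.strike_threshold

    def rehabilitate(self, brand):
        """Clear a brand's record once it demonstrably cleans up its act."""
        self.strikes.pop(brand, None)

registry = DefectorRegistry()
registry.report("EvilCar", "cut off a Cruise at a zipper merge")
print(registry.should_cooperate_with("EvilCar"))  # False: the whole club shuns it
print(registry.should_cooperate_with("Waymo"))    # True: no strikes on record
```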

Everybody cooperates and takes the optimal path to share the road and improve safety. With no law requiring it. As long as your fleet is not the majority of cars on the road, you are crazy to do anything else.

The renegade humans

Robocars must share the road with human drivers, who still need a vehicle code and aren’t in the room to discuss how to cooperate. While the robocars can’t mete out punishment to all human drivers for one human’s selfishness, they can read license plates. As such, when a human driver does something clearly non-cooperative, or illegal under the human vehicle code, it can be remembered. After a certain number of selfish moves, a fleet could decide to no longer cooperate by default with the drivers of that car. It need not even share that with other fleets; losing the good will and cooperation of a single fleet would be enough to make a person regret it.

Not that the robots would do anything illegal. They would just stop being friendly and cooperative. They would stop yielding. They would stop making room to let the vehicle in, the way cooperators do, knowing they will get room made for them by another cooperator. Learning that this had happened to them, the human driver would go online, promise to be better, and rejoin the group of cooperators. Oh, the human won’t be perfect about it, but if they do a good job and become nice again, others will be nice to them.
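A minimal sketch of what that per-plate memory might look like, assuming a fleet tracks strikes privately and lets them age out when the driver behaves; the three-strike limit and the thirty-day forgiveness window are invented for illustration:

```python
# Per-plate reputation for human drivers, kept privately by one fleet.
# STRIKE_LIMIT and FORGIVE_AFTER are illustrative assumptions.
import time

STRIKE_LIMIT = 3             # selfish moves tolerated before courtesy stops
FORGIVE_AFTER = 30 * 86400   # strikes expire after ~30 days of good behavior

class PlateReputation:
    def __init__(self):
        self._strikes = {}   # license plate -> timestamps of selfish moves

    def record_selfish_move(self, plate, now=None):
        now = time.time() if now is None else now
        self._strikes.setdefault(plate, []).append(now)

    def extends_courtesy_to(self, plate, now=None):
        now = time.time() if now is None else now
        recent = [t for t in self._strikes.get(plate, [])
                  if now - t < FORGIVE_AFTER]
        self._strikes[plate] = recent        # old strikes age out on their own
        return len(recent) < STRIKE_LIMIT

rep = PlateReputation()
for _ in range(3):
    rep.record_selfish_move("7ABC123")       # hypothetical plate
print(rep.extends_courtesy_to("7ABC123"))    # False, until the strikes expire
```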

None of this requires special radio communication between the cars. It will all be figured out the same way it is today. Each car will look at the situation, perform “golden rule” calculations, and do what its programmers would want other cars to do if the situation were reversed, or rather an agreed-upon calculated balance. There are a thousand courtesies that can be worked out, then refined and improved at regular meetings. Infractions can be uploaded and examined to see whether, in the light of more data, they were reasonable or showed bad intent, and the record can be corrected either way. While the robots will implement the courtesy, it will be humans figuring out what courtesy means.
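As one hedged illustration of such a calculation, a car might yield whenever the delay it would impose by not yielding exceeds the delay it suffers by yielding, with known defectors excluded per tit for tat. The specific balance rule and delay estimates below are placeholders, not an agreed standard:

```python
# Toy "golden rule" yield decision: do what you would want the other car
# to do if the positions were reversed, using estimated delays in seconds.
def should_yield(my_delay_if_i_yield, their_delay_if_i_dont,
                 other_is_cooperator=True):
    if not other_is_cooperator:
        return False   # tit for tat: no courtesy for known defectors
    # Symmetric balance: yield when the other side stands to lose more.
    return their_delay_if_i_dont > my_delay_if_i_yield

# Letting a merging car in costs me 2 s but saves them 15 s: yield.
print(should_yield(2.0, 15.0))                              # True
# A known defector gets no courtesy, whatever the numbers say.
print(should_yield(2.0, 15.0, other_is_cooperator=False))   # False
```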

Here are some behaviors which could go away:

  1. Two lanes are merging. Good cars occupy both lanes and zipper merge before the join point. Bad cars jump ahead past that merge point to try to barge in.
  2. Traffic is heavy. Bad cars divert off onto side-routes to skip a bit of traffic, then cause the very jam they were trying to avoid by pushing back into the main route further along.
  3. A car jumps ahead of the proper order at a 4-way stop, or makes a “Pittsburgh left” by jumping the gun on the green.
  4. Cars don’t make a gap for merging cars to enter a lane.
  5. Cars on a side street can’t turn due to heavy traffic, because nobody lets them in.
  6. Cars cut in front of others when there is a small gap, forcing the car they cut off to brake.
  7. Cars make frequent lane changes when there is no stoppage in their lane.
  8. Cars stay in a passing lane while electing not to move faster than the cars to be passed.
  9. Tailgating (but not speeding; that’s handled elsewhere).
  10. And many more.

This system generates a road where everybody is working together, seeking the strategies which produce the best traffic flow, or whatever other rules we choose to include.

Exceptions and emergencies

As noted, if a vehicle drives selfishly, it would get tagged as a defector. For a human, this might not happen with the first offense, but it would eventually happen. It could also be possible for any vehicle — robot or human driven — to decide there is an emergency, and drive selfishly. After it was done, if it got a lot of reports of selfish driving, the person who declared the emergency could demonstrate that there really was an emergency and not get marked as a defector.

Beyond this, a person could declare an emergency or even an urgency to a database that the robocar fleets follow and distribute among their cars. In that case, the other cars might even go out of their way to help you, getting out of your way almost as if you were an ambulance once they spotted you or knew you were coming. (No car-to-car communications here; this would all be in the cloud.) Of course, they would only do this if you had somehow earned the right to do it, perhaps after a year of good cooperation, and you certainly couldn’t do it very often. It would be a trade: others would get out of the way for you, and you would be expected to get out of the way for many of them, if you are a cooperator. Money could even be offered, though there are issues with letting the rich have better rights on the road than the poor. Many systems are possible, some good and some bad, but they are possible for the first time, because there has never before been a way to trade, or any reward for cooperating.
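Here is a minimal sketch of such a cloud-side trade, assuming drivers bank credits by yielding and spend them to declare an urgency that fleets then push to their cars; the credit amounts and cost are invented for illustration:

```python
# Cloud-based "urgency" trade: earn credits by cooperating, spend them
# to ask other cars to clear a path. All numbers are assumptions.
class UrgencyLedger:
    URGENCY_COST = 10            # credits burned per declared urgency

    def __init__(self):
        self.credits = {}        # driver id -> banked cooperation credits
        self.active_urgencies = set()

    def record_yield(self, driver):
        """Credit a driver each time they get out of someone's way."""
        self.credits[driver] = self.credits.get(driver, 0) + 1

    def declare_urgency(self, driver):
        """Spend banked credits; fleets distribute the grant to their cars."""
        if self.credits.get(driver, 0) < self.URGENCY_COST:
            return False         # not enough good behavior banked
        self.credits[driver] -= self.URGENCY_COST
        self.active_urgencies.add(driver)
        return True

ledger = UrgencyLedger()
for _ in range(12):
    ledger.record_yield("driver-42")          # hypothetical driver id
print(ledger.declare_urgency("driver-42"))    # True: a path gets cleared
```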

Regulators

As noted, all of this can exist without regulation, except for one thing: those who do not agree to cooperate. The cooperators could not be allowed to cooperate in a way that penalized others unfairly. This should not become one group of cars simply ganging up on other cars through the strength of their numbers. So even if some cooperators agreed to get out of the way of somebody in a special hurry, they could not do that in a way that slows traffic for people who aren’t getting a benefit later.


