According to Elon Musk, along with a host of automotive and transportation experts, autonomous vehicles were supposed to be a familiar feature of the modern-day cityscape by now.
Such ambitious predictions have since proved to be significantly premature.
The autonomous vehicle hype bubble appears to have burst around 2015, and many in the industry are responding not by giving up but by acknowledging the hard yards and steep learning curve that lie ahead.
A question of “when,” rather than “if,” still appears to be the most pertinent one to direct at a driverless future. One man dedicated to problem-solving within a single but hugely important corner of the autonomous vehicle landscape is Dr. Nicholas Giudice.
Giudice is a Professor of Spatial Informatics at the University of Maine’s School of Computing and Information Science. He is also a founder and Chief Research Scientist at the University’s VEMI lab.
The lab houses the Autonomous Vehicle Research Group, which focuses on accessibility considerations in new and emerging transport technologies.
As a blind guide dog user, Giudice is uniquely positioned to appreciate how meeting wider access needs can help win public confidence in what remains a fledgling technology, while also ensuring that people with disabilities do not get left behind.
He feels strongly that, in contrast to most technological trends, it is the older generation, particularly those who have some manner of physical impairment, who hold the most promise in driving the early adoption of autonomous vehicles.
“Older drivers are far more likely to be in accidents due to age-related impairments and taking their license away is always a huge deal,” says Giudice.
“If they are being told that they may have to lose their license or they are scared because they nearly hit someone, then suddenly, autonomous vehicles seem like a very attractive alternative solution. This is what we are understanding from our initial canvassing,” he continues.
He adds, “Autonomous vehicles could find more acceptance amongst older drivers due to having that extra motivation. It’s funny somehow because we don’t commonly think of older people as early adopters of technology.”
The long and winding road ahead
Though the promise of self-driving vehicles is alluring on so many levels, consumers still need to overcome the cognitive dissonance of sitting side-on to an empty driver’s seat and a steering wheel that moves by itself.
Big Tech, on the other hand, is still grappling with America’s uneven geographical and regulatory landscape in addition to the significant ethical and technological challenges of operating autonomous automobiles alongside impulsive and unpredictable human drivers in legacy vehicles.
This is perhaps why Alphabet continues to restrict its Waymo ridesharing service to a small, comprehensively mapped area of Phoenix, Arizona, and why Uber chose to sell off its self-driving division to Aurora back in January in a deal worth around $4 billion.
Accessibility for some could mean usability for all
One factor that could prove a boon for passengers with disabilities is that the novelty of the technology can be seen as a great leveler.
A general sense of unfamiliarity and perception of risk may be shared by able-bodied and disabled passengers alike, and the means of addressing this is common to both groups.
Giudice believes the key lies in effective vehicle-passenger communication and collaboration.
“There is still a big trust problem at play with autonomous vehicles,” says Giudice.
“Passengers experience a loss of control and don’t know what’s happening in the black box.
“But I think a lot of the trust problem is actually just a knowledge problem of understanding how it works. So, something like the car telling you that it’s going to switch lanes because there’s a vehicle slowing down ahead, or it’s stopping because there’s a tree branch in the road.”
He continues, “If we don’t expose the black box, if we don’t tell the human why the vehicle is doing what it’s doing, then people are going to be nervous.”
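As a rough illustration of the kind of plain-language explanation Giudice has in mind, the short Python sketch below shows how a planned maneuver might be turned into a spoken announcement for the passenger. The names here, such as Maneuver and explain, are entirely hypothetical assumptions for the sake of the example, not any manufacturer’s actual system.

```python
# Hypothetical sketch: turning a planned maneuver into a plain-language
# explanation that exposes the "black box" to the passenger.
# All names and message templates are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Maneuver:
    action: str  # e.g. "change_lane_left", "stop"
    reason: str  # e.g. "slower vehicle ahead", "tree branch in the road"


def explain(maneuver: Maneuver) -> str:
    """Compose a short spoken announcement describing why the car is acting."""
    templates = {
        "change_lane_left": "I'm moving into the left lane because there is a {reason}.",
        "change_lane_right": "I'm moving into the right lane because there is a {reason}.",
        "stop": "I'm stopping because there is a {reason}.",
    }
    template = templates.get(
        maneuver.action, "I'm adjusting course because there is a {reason}."
    )
    return template.format(reason=maneuver.reason)


if __name__ == "__main__":
    print(explain(Maneuver("change_lane_left", "slower vehicle ahead")))
    print(explain(Maneuver("stop", "tree branch in the road")))
```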
In some ways, equipping ridesharing fleets with standard accessibility hardware and software, such as wheelchair spaces, ramps, easily operable door handles and safety belts, and a selection of both audio and visual interfaces, is the simpler side of the accessibility equation.
Giudice believes far greater complexity lies in giving the onboard AI sufficient situational awareness to adapt to a rapidly changing environment, especially for passengers who are likely to require additional information due to sensory impairment and for whom safety is at stake.
“The vehicle will need to be monitoring exactly where the passenger is sitting, particularly as many will be side-on in autonomous ridesharing,” says Giudice.
“If it says, ‘Your destination is on the right’ or ‘Get out on the right’ that only makes sense if it knows the passenger’s orientation. If I get confused and get out into traffic — as a blind person that’s extremely dangerous. So, the language has to be very precise.”
He adds, “People need to think about the entire trip and that’s not yet happening. It’s also about summoning and locating the vehicle, the journey itself and then what to do at the destination. Those are several distinct processes involving different components of accessibility.”
“Ultimately, to give passengers with disabilities a safe and efficient ride experience, we will need to be using a far more powerful AI than a simple Amazon Alexa type.
“One that is capable of deploying computer vision to assess complex environments and react with smarter, more dynamic conversations,” he says.
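To make the orientation problem Giudice describes concrete, the Python sketch below shows how a vehicle-frame direction such as “the exit is on the vehicle’s right” might be translated into a passenger-relative cue once the passenger’s seating orientation is known. The seat layout, function names and clockwise convention are illustrative assumptions, not part of any production system.

```python
# Hypothetical sketch: converting a vehicle-frame direction into a
# passenger-relative cue, given which way the passenger is facing.
# The clockwise side ordering and function names are assumptions.
VEHICLE_SIDES = ("front", "right", "rear", "left")  # clockwise order


def passenger_relative(vehicle_side: str, passenger_facing: str) -> str:
    """Return the side as the passenger perceives it.

    vehicle_side: the side of the vehicle the exit/destination is on.
    passenger_facing: the vehicle side the passenger is facing
    (side-on seating is common in proposed autonomous rideshare layouts).
    """
    offset = (VEHICLE_SIDES.index(vehicle_side)
              - VEHICLE_SIDES.index(passenger_facing)) % 4
    return ("in front of you", "on your right", "behind you", "on your left")[offset]


if __name__ == "__main__":
    # Passenger seated sideways, facing the vehicle's left side:
    # an exit on the vehicle's right is actually behind them.
    print("Your destination is", passenger_relative("right", "left") + ".")
    # Passenger facing forward: vehicle-right and passenger-right coincide.
    print("Your destination is", passenger_relative("right", "front") + ".")
```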
A question of application
Another approach to making accessibility provisions in autonomous vehicles more personalized, consistent and familiar to passengers with disabilities is to leverage many of the existing accessibility features in the rider’s smartphone.
This is precisely what Giudice and his team at the VEMI lab have attempted to do in creating AVA, the Autonomous Vehicle Assistant.
AVA, which was awarded a $300,000 research and development prize by the U.S. Department of Transportation earlier this year, is an app-based system intended to help disabled passengers negotiate all stages of an autonomous ridesharing journey.
Developed in conjunction with Colby College and Northeastern University, the app helps integrate access needs directly into the booking system, ensuring that passengers are seamlessly provided with the appropriate accessibility provisions from the get-go.
AVA will also deploy haptic feedback and make use of Augmented Reality to help passengers with low vision locate their vehicle, operate the door handles and disembark safely by highlighting the appropriate components on the phone’s screen.
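As an illustration of what folding access needs into the booking flow could look like, here is a minimal Python sketch in which a ride request carries an accessibility profile that a dispatcher could match against fleet features. The field names are hypothetical assumptions and do not reflect AVA’s actual implementation.

```python
# Hypothetical sketch of attaching access needs to a ride-hailing request so
# the assigned vehicle is configured appropriately from the start.
# Field names are illustrative assumptions, not AVA's real data model.
from dataclasses import dataclass, field


@dataclass
class AccessProfile:
    wheelchair_space: bool = False
    ramp_required: bool = False
    audio_guidance: bool = False   # e.g. for riders who are blind or low vision
    visual_guidance: bool = False  # e.g. for riders who are deaf or hard of hearing
    haptic_cues: bool = False      # phone vibration patterns for locating the vehicle


@dataclass
class RideRequest:
    pickup: str
    destination: str
    access: AccessProfile = field(default_factory=AccessProfile)

    def required_vehicle_features(self) -> list[str]:
        """List the fleet features a dispatcher must match before assigning a car."""
        wanted = []
        if self.access.wheelchair_space:
            wanted.append("wheelchair securement area")
        if self.access.ramp_required:
            wanted.append("deployable ramp")
        if self.access.audio_guidance:
            wanted.append("spoken turn-by-turn and exit guidance")
        if self.access.visual_guidance:
            wanted.append("on-screen captions and visual alerts")
        if self.access.haptic_cues:
            wanted.append("phone-linked haptic wayfinding")
        return wanted


if __name__ == "__main__":
    request = RideRequest(
        pickup="campus library",
        destination="medical center",
        access=AccessProfile(audio_guidance=True, haptic_cues=True),
    )
    print(request.required_vehicle_features())
```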
As for the day when autonomous vehicles might move away from specialist use cases such as long-haul trucking and become more mainstream — Giudice admits he has no crystal ball.
“Right now, the technology is way ahead of the policy,” he says.
“There’s this patchwork of regulation and legislation around autonomous vehicles across different states.
“If you ask me when it’s going to happen — I think, certainly by 2025, as a blind person, I would be completely capable of operating a vehicle to make a trip by then. Whether it will be legal is a different matter.”
As the arrival of autonomous vehicles would herald the greatest evolution in transportation since the invention of the automobile itself, it is understandable why progress appears slow and tentative.
This long period ahead for fine-tuning the technology at least offers ample time to evaluate and bake in accessibility from the outset.
More than this, vehicle and system designers should already be aware that autonomous driving is an emerging technology that many in the general population remain nervous about participating in.
If the usability bar is set extremely high from the start, by ensuring that vehicle and system design account for the most extreme impairment use cases, public hesitancy can be more readily assuaged and a lot more people brought along for the ride.