Would you trust being driven in a self-driving vehicle?
One of the concerns that some of you have expressed on our Facebook page [1, 2, 3] about self-driving vehicles such as Olli revolves around vehicle safety, and more specifically, the ethical dilemmas that arise when an autonomous vehicle has to make a split-second decision to maintain overall vehicle safety. I want to preface this by saying that my opinions here are meant to get us thinking critically about these concerns so that we can improve safety and survivability across all scenarios of autonomous driving.
Researchers from MIT have set out to learn about people’s preferences for how they would expect an autonomous vehicle to behave in a catastrophic situation. The Moral Machine asks the user to choose between two different scenarios faced by an autonomously operated vehicle. The consequences of these decisions, and the reasons users make them, are daunting even to contemplate when the choice is between life and death for passengers or pedestrians. After judging 13 scenarios, you’re presented with your results and how they deviated from the average responses of other participants.
I don’t deny that this survey is a neat tool for measuring how people rationalize who should be the beneficiary of crash-avoidance maneuvers. It functions as a powerful way to spark conversation, which is one of the project’s stated purposes. However, I don’t subscribe to its underlying presumption that every outcome must result in one or more fatalities, for two reasons. First, I don’t think autonomous vehicles need to operate faster than their onboard technology can reasonably keep up with. Second, I don’t believe every potentially hazardous scenario has to end in a fatality.
Low-speed autonomous vehicles are the place to start
It’s not necessary for driverless vehicles to go 0-60 in 2.2 seconds or to travel faster than 75 MPH. Autonomous driving technology is still nascent; for now, it can reasonably keep up with a driving environment moving at 35 MPH or less. As the technology matures and proves itself, manufacturers can certify it at higher speeds and ratchet up from there. The industry has already seen success with this incremental approach in the adoption of automatic emergency braking standards by 2022.
When a vehicle moves at a slower speed, it has more time to perceive its surroundings and make better decisions. In situations where an impact is unavoidable, the vehicle can take evasive action sooner to protect passengers and third parties alike. Think about how many objects you can keep track of while cruising down the highway at 75 MPH versus driving through an urban environment at 35 MPH. Then consider that autonomous vehicles are designed to maintain full 360° visibility, unlike a human’s roughly 180° field of view.
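To make the speed argument concrete, here is a back-of-the-envelope stopping-distance sketch. The reaction time and braking deceleration below are assumed illustrative values, not figures from any vehicle spec or from this post, but they show how quickly the distances diverge between 35 MPH and 75 MPH.

```python
# Back-of-the-envelope stopping-distance comparison (illustrative only).
# Assumed values: 1.5 s perception/reaction time and 7 m/s^2 braking
# deceleration (roughly dry pavement); real-world numbers vary widely.

MPH_TO_MS = 0.44704          # miles per hour -> meters per second
REACTION_TIME_S = 1.5        # assumed time to perceive and react
DECELERATION_MS2 = 7.0       # assumed braking deceleration

def stopping_distance_m(speed_mph: float) -> float:
    """Distance covered while reacting plus distance to brake to a stop."""
    v = speed_mph * MPH_TO_MS
    reaction = v * REACTION_TIME_S               # travels at constant speed while reacting
    braking = v ** 2 / (2 * DECELERATION_MS2)    # v^2 / (2a)
    return reaction + braking

for mph in (35, 75):
    print(f"{mph} MPH -> ~{stopping_distance_m(mph):.0f} m to stop")
# 35 MPH -> ~41 m to stop
# 75 MPH -> ~131 m to stop
```

Under these assumptions, roughly doubling the speed more than triples the distance needed to stop, which is the intuition behind starting autonomous deployments at lower speeds.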
As with any new technology, self-driving vehicles will have defects as they’re being developed. We can’t dismiss the potential risks of driverless vehicles, but we can engineer them to be safer, smarter, and more consistent than human drivers. (Not to mention that with additive manufacturing, we can make cost-effective iterative improvements to future vehicles, too.) And while some legitimate security concerns remain, security researchers are eager to probe these vehicles and propose updates that close vulnerabilities before they can be exploited.
Human drivers are prone to error, sometimes highly so
There’s a common perception that human drivers are innately better (or more ethically sound) than autonomous vehicles. Hard data begs to differ.
In 2014, NHTSA reported that 3,179 people were killed in the United States in vehicle crashes involving distracted drivers, representing 10 percent of all motor vehicle fatalities that year. Of those fatalities, 520 were non-occupants such as pedestrians and bicyclists. And deaths aren’t the only toll: NHTSA estimated that 431,000 people were injured due to distracted driving.
These figures make a strong case that human drivers are far from infallible. We can pass legislation mandating hands-free cell phone use while driving and run public service announcements discouraging phone use behind the wheel, but none of it is making a dent in our behavior. And I haven’t even touched on intoxicated driving.
We have a problem, and we ought to confront it with practical solutions that save lives.
Developing autonomous vehicle technology is ethical
For us to hold autonomous vehicles to an ethical standard, we have to understand that ethics are a set of principles defined by a moral code. Are we being intellectually and morally honest if we impede the development of autonomous vehicles? I don’t think we should be blind to the risks, but we have a very real problem to solve. We owe it to ourselves to improve the odds of survival for the more than 30,000 people killed on U.S. roads every year.
Innovation in driverless cars is legitimately disruptive, and that runs counter to what we’re used to in transportation. We’ve grown comfortable with incremental progress in today’s cars: add an airbag here, another airbag there, throw in blind-spot alerts, and we have only marginal safety improvements to show for it. We now have the opportunity to remove the largest causal factor in automotive crashes: us. That’s a scary thought for some, because it would eventually relegate human-controlled driving to closed-course tracks, turning it into a hobby rather than a rite of passage into adulthood.
One ethical aspect of the industry that needs improvement is how we describe these systems and set expectations for consumers. We also need to do a better job of educating consumers about autonomous vehicle technology so they understand how the vehicle operates and what their responsibilities are. I’m confident that once consumers understand the immediate and long-term benefits of self-driving cars, they’ll give them the green light.
If we avoided developing new transportation solutions for fear of the potential risks, we’d still get around riding on the back of exactly one horsepower. The moral cost of not positioning autonomous vehicles for success is something few of us think about, but maybe you have now.
Do you support or oppose the idea of having autonomous vehicles on the streets? I’d love to learn why you think so in the comments below.