Last month, the National Highway Traffic Safety Administration (NHTSA) released proposed policy guidelines for the development and regulation of self-driving cars. Although the guidelines were the product of consultation with industry and state regulators, they are deliberately styled as preliminary, with input sought from the public about how to improve them. Given their preliminary status and the vagueness of many of the guidelines, it is probably more accurate to refer to the NHTSA document as a proposal for a policy rather than a policy itself.
Nevertheless, observers rightly treated the announcement of the NHTSA guidelines as an important indication that widespread use of self-driving cars on public roads will soon be a reality. Government agencies do not produce hundred-plus-page reports in response to science fiction. By comparison, the Obama administration explained why it rejected a petition to build a Star Wars-style Death Star in four sentences.
We can anticipate three phases of government policy regarding self-driving cars. In the first and current phase, government regulators working with industry leaders anticipate and look to head off problems, so that self-driving cars can be introduced and sold to the public. The second phase will be one of transition, as new driverless or semi-driverless cars share the road with the legacy fleet of manually operated cars.
This column addresses an issue that will arise in the third and final phase, when driverless cars dominate the roads: Should people who prefer to drive for themselves be permitted to do so? The NHTSA guidelines do not address this question, but in twenty years or so, state and federal regulators will need to. With progress on self-driving cars coming much faster than many experts anticipated, it is not too early to begin thinking about whether we should work toward banning human driving on public roads.
Why Ban Manually Operated Cars?
People fear new technology. Thus, although 2015 saw almost 40,000 Americans killed and over four million seriously injured on the roads, it was bigger news this past summer when the driver of a Tesla operating in Autopilot mode died.
To be sure, that is not a fair numerical comparison, because there are so many more cars on the road being operated by human drivers than by computer. Still, Tesla, Google, Uber, and other companies developing and testing driverless cars understand that to earn the trust of the public they must market technology that is not merely as safe as the cars currently being driven by humans but at least an order of magnitude safer.
What should happen when self-driving cars achieve overwhelming superiority in safety relative to human-driven cars? If a self-driving car is only one-tenth as likely to get into a serious or fatal accident as a human-driven car, should government nonetheless permit humans to drive their own cars? If so, is there any safety ratio that would justify a ban? What if self-driving cars are a hundred times safer? A thousand?
Beyond the safety improvements we can expect from self-driving cars relative to human-driven cars, there are network effects to consider. Two self-driving cars at risk of colliding can communicate with one another in milliseconds to coordinate accident-avoidance maneuvers. A self-driving car nearing a collision with a human-driven car can only move itself; the slower reaction time and uncoordinated decisions of the human driving the other car create a further safety disadvantage.
Thus, the argument for banning human-driven cars is really quite simple: It would save many lives and avert many more serious injuries.
Reasons to Drive Yourself
What sorts of reasons might nonetheless be invoked against banning human-driven cars? Consider a few.
Some people might want to drive themselves simply for entertainment. Go-kart racing and computer-simulated driving games are popular because driving can be fun.
Other people might want to drive their own cars less as a form of thrill seeking than as a kind of retro chic. Just as today’s hipsters prefer typewriters to laptops, vinyl records to Spotify, and penny-farthings to lightweight bicycles with multiple gears, the hipsters of 2040 might derive ironic satisfaction from piloting their 2012 Hyundai Sonatas on roads mostly filled with self-driving cars. Even today, owners of antique cars occasionally take their Model A Fords out for a spin.
Still others might want to drive themselves in order to feel safer, even if doing so does not actually make them safer. A tiny number of exceptionally good drivers might even be right about their ability to drive more safely than the robocars—although as technology improves, their numbers would dwindle to zero.
Yet even if people’s feelings about safety do not correspond with actual safety, they still have an interest in driving themselves. Many people are afraid of flying, even though for any substantial distance it is a considerably safer mode of travel than driving. We nonetheless allow people to travel by car or bus because we recognize that involuntarily subjecting them to the anxiety that accompanies flying would do real harm. People with similar anxieties about self-driving cars would suffer like harm from a ban on human driving.
Weighing the Competing Concerns
How should regulators trade off the additional safety that a human-driving ban would achieve against the real and perceived benefits that some people derive from driving their own cars? If we apply simple cost-benefit analysis, it would seem that the benefits of banning human-driven cars from the roads outweigh the costs. Tens of thousands of lives saved and millions of serious injuries averted each year pretty clearly outweigh not just the forgone satisfaction of future retro hipsters but even the more substantial harm of anxiety for the robophobes.
But is simple cost-benefit analysis the right tool? Our law has a libertarian streak. It generally permits people to engage in risky activities for entertainment and other purposes, even when policy experts think that the risks are not worth taking. Quiet reading and chess are safer than skydiving and mountain climbing, but we don’t ban the latter activities on the ground that people can entertain themselves in safer ways.
Indeed, we don’t even require that everyone drive a Volvo. If you want to save money or trade safety for performance, you can. In general, the law requires that cars and other products be “safe enough,” not that they be as safe as the safest car (or other product) on the market.
Are today’s human-driven cars “safe enough”? They are for us, but that does not mean that they should be regarded as sufficiently safe in a couple of decades, when self-driving cars have changed the baseline. Safety equipment improves over time, and when it does, prudent regulators raise the standards. Prior to the 1940s, few cars on the road had turn signal lights, but no one today would argue that it should therefore be permissible to sell or drive a car without them—even though hand signals can be used as a substitute.
The case against regulation is strongest when people seek the freedom to take risks for themselves but not for others. Even then, however, paternalism sometimes wins out over libertarianism. Moreover, many ostensibly self-regarding acts have social consequences. The adult who dies because he exercised the freedom to ride a motorcycle without a helmet (as permitted in some states) might orphan his children at great psychic cost to them and financial cost to the state.
Seen in this light, the case for eventually banning non-self-driving cars is especially strong. There is at least a comprehensible libertarian argument that adults should be able to decide for themselves whether to wear helmets or buckle their seatbelts, because these safety devices protect the users themselves. By contrast, many of the expected safety gains of self-driving cars will accrue to people riding in other vehicles.
Someday in the not-too-distant future, the claim that people should be permitted to drive their own cars will sound as implausible as the claim that people should be permitted to drive drunk.