Which scares you more: Human error, or faceless robots?
You’ll eventually have to pick a side. Eliminating error-prone human agency from the control of two-ton vehicles is the obsession of Google, Tesla and the world’s biggest automakers. Tesla CEO Elon Musk has even gone so far as to suggest human drivers may eventually be outlawed, though he later clarified that his company would always welcome drivers.
Truly automated cars are nowhere near a reality today. But they will be one day, and when they arrive, so will the fireworks around some of the most cherished traditions in American society: freedom, driving, independence.
They’re also going to be a really big headache: insurance companies, departments of motor vehicles and law enforcement agencies around the country aren’t ready for them. The world may not be ready.
A car that can pick you up
The promise of self-driving cars is always on its way but never quite there.
The Consumer Electronics Show in Las Vegas this year was dominated by Audi, BMW, Mercedes and others touting their plans for driverless cars. Google started hinting at partnering with major automobile manufacturers. Even Apple is rumored to be eyeing self-driving car technology.
Last week, Tesla CEO Elon Musk announced a major new software update that would bring a host of automated features to some of the company’s existing cars, including the option to have your car pick you up (as long as you’re on your private property); park itself; and drive for long stretches on highways.
Many publications excitedly declared that Tesla would be bringing “self-driving” cars to the road this summer. Tesla hinted that it could have done more, but instead stuck to a set of features only somewhat more advanced than those already available in the Mercedes-Benz S-Class and the Infiniti Q50.
Alexis Georgeson, a spokesperson for Tesla, declined to say how much the company’s plans were constrained by regulation rather than by technology, other than to stress that “there is nothing in our autopilot systems that are in conflict with existing regulations.”
“I think if we woke up tomorrow and all the regulations were in place to have autonomous vehicles, we could have functional autonomous vehicles in a matter of months,” says Karl Brauer, an analyst with Kelley Blue Book. “It will ultimately be held up by lawyers and legislation, not by technology.”
Looking at the hype, it quickly becomes evident that it’s much easier for most of the industry to be bold when there’s no firm release date.
Mercedes, for example, shows off fantastical renderings of how its “self-driving” F 015 vehicle could give drivers more leisure time, but the only year mentioned on the landing page is “2030.”
2030. Plenty of time.
An industry on ‘autopilot’
In early 2011, Sebastian Thrun got on stage at the TED conference and opened up about how the death of a close childhood friend in a car accident caused him to develop driverless cars at Google.
“Now I can’t get my friend Harold back to life, but I can do something for all the people who died,” Thrun said. “I’m really looking forward to a time when generations after us look back at us and say how ridiculous it was that humans were driving cars.”
Interest in driverless technology has only accelerated since that talk, but automakers choose their words carefully as they circle around regulations and liability concerns.
For the Mercedes-Benz S-Class, the semi-automated model that is actually on the market, the words “driverless” and “self-driving” are not used. The same is true of Tesla: while the media may describe its cars as “self-driving,” reps for the company stressed to us that they are not, echoing a point made by its CEO.
“It’s important for us to differentiate autonomous driving from autopilot,” Musk said in one interview last year. “We’re not asserting that the car is capable of driving in the absence of driver oversight.”
The Tesla cars, Musk pointed out, are theoretically able to drive themselves from the parking lot, through local streets, onto the highway and to your destination; they just won’t do so anytime soon. You could potentially stop holding the steering wheel for long stretches and ride like a passenger, but as with other companies, Tesla is expected to require that drivers keep paying attention at regular intervals.
“They’re not using the word ‘self-driving’ because that could be something a plaintiff’s lawyer could point to if there was an accident,” says Brauer, the Kelley Blue Book analyst. If the car owner effectively has no responsibility in overseeing the vehicle on the road, then it’s easier to claim he or she has no legal responsibility or liability in court if that vehicle gets into an accident.
Who decides when driverless cars are safe?
The U.S. National Highway Traffic Safety Administration currently breaks down automated vehicles into five levels, numbered 0 through 4, with the highest level being the mythic driverless car of the future.
Even with the software updates, experts say Tesla’s cars will probably fall in the range of Level 2, meaning the car handles some functions for the driver, but doesn’t actually let the driver “cede full control of all safety-critical functions” in various situations.
“I would be astonished if anybody is ready to introduce a system that goes above that threshold in the near future,” says Steven Shladover, a research engineer at the University of California, Berkeley, who worked with California’s department of motor vehicles to figure out regulation for automated cars.
As it stands, just a handful of states, including California, Nevada and Florida, have passed regulations that allow for testing driverless cars, but there is a long road ahead to legalize the general use of these vehicles at the state level. Ultimately, that regulation would need to be national in scope, or it would render these cars useless on long-distance trips.
The single biggest issue holding up potential regulations, Shladover says, is deciding on a process for certifying that particular driverless cars are safe enough for public roads “in the absence of any specific standards on this topic.”
Should it be left to companies like Tesla and Google to certify the safety, or the state, or another entity?
“We are a long way from fully automated technology — maybe 15 to 20 years off,” says Brad Stertz, a spokesperson for Audi, erring on the conservative side. “The concern is laws written to address that future start well before research has been completed to get us there.”
The uncertain road ahead
Who is to blame in a crash between a self-driving car and a manually driven car?
Will car ownership decrease if self-driving vehicles take off? If so, who will actually be buying car insurance?
Does your age, gender or driving history matter for determining insurance premiums if the car drives itself?
Both State Farm and Progressive, two of the largest car insurers, say they’ve been thinking about the impact of self-driving cars for years, but their public statements remain decidedly vague.
“As connected and automated vehicle technology reduces or eliminates some risks that drivers face today, new risks are likely to emerge,” a rep for State Farm told Mashable. “We are focused on the big picture.”
Howard Mills, global insurance regulatory leader at Deloitte and former superintendent of the New York State Insurance Department, says broader adoption of driverless cars could benefit insurers in some ways. For example, even as crashes become rarer, replacement parts could get more expensive, allowing insurers to charge higher premiums in those cases.
“I think you will probably see the evolution of different types of policies,” Mills says. “You may have to sign [a document] that you will only be a passenger in fully automated vehicles. Maybe in the future you won’t have to have a driver’s license at all.”
Law enforcement officials will similarly have to adjust to incidents involving automated vehicles.
“If you want to use any kind of driving laws, those are all aimed at the person operating the vehicle,” says Sgt. Sean Whitcomb, a public information officer at the Seattle Police Department. Seattle is a city Musk specifically name-dropped when talking about Tesla’s new autopilot features.
Drinking, sleeping or talking on the phone while driving may be offenses now, but not so much if the person behind the wheel isn’t actually driving.
“Driving under the influence means you are operating the vehicle,” Whitcomb says. “If the vehicle is automated, completely automated, then you are just a passenger.”
Most police officers haven’t given these issues any thought yet. Police departments from Milwaukee to Oklahoma City to Fort Lauderdale all told Mashable that there have been zero conversations about self-driving cars to date. As one Nashville police official put it, “we’ll cross that bridge when we get there.”
– via MASHABLE