Should driverless vehicles break the law to increase safety?
The cars are twice as likely to get in an accident, in part because they drive the speed limit.
Justin Fine/AP
Driverless vehicles' penchant for closely following the rules of the road makes them twice as likely to be involved in an accident.
That's according to a study released in October by the University of Michigan Transportation Research Institute, which found that self-driving vehicles may have a higher accident rate than conventional vehicles, though they were not found to be at fault in any of the crashes.
Injuries involving driverless vehicles are likely to be less severe than those involving vehicles operated by humans because they tend to drive at or below average speed, the study also found. In November, a California highway police officer pulled over a driverless car traveling 24 mph in a 35-mph speed zone.
This week the California Department of Motor Vehicles recommended banning driverless automobiles from operating without a human behind the wheel, even though fully autonomous cars are not yet widely deployed on the roads.
A debate is now brewing among the vehicles' designers over whether to teach the cars to commit minor traffic infractions in order to better meld with the far more numerous human drivers, who don't tend to follow all the rules.
The study looked at data from three major driverless vehicle producers – Google, Delphi, and Audi – conducting an initial analysis on the automobiles’ road safety records and comparing that information to the records of conventional vehicles, based on 2013 statistics, according to the university.
Researchers noted that tests on driverless vehicles generally take place under more favorable weather conditions and cover a much shorter overall distance: 1.2 million miles, versus 3 trillion miles driven by conventional vehicles. They also corrected for underreporting of car accidents in which no one is killed.
Raj Rajkumar, a director of Pittsburgh's General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab, said that for now companies would continue to follow the speed limit.
“We end up being cautious,” Mr. Rajkumar told Bloomberg. “But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you.”
Google has stuck to the idea that a driverless car should go the speed limit and counters claims that this makes the cars unsafe, even as the development arm of the company moves to make them more assertive on the roads.
Brandon Schoettle, who co-authored the study, said driverless vehicles do sometimes react to situations in ways surprising to human drivers, sometimes making awkward decisions like “stopping in a situation or slowing down when a human driver might not.”
"They’re a little faster to react, taking drivers behind them off guard,” he said.
Many accidents involving driverless vehicles happen when human drivers misinterpret their moves, striking them from behind. But because the vehicles follow speed limits, those accidents tend to result in minor injuries.
“They’re a little bit like a cautious student driver or a grandma,” said Dmitri Dolgov, principal engineer for the Google program, to Bloomberg, adding that his team is working on a more fluid reaction for the driverless cars.