Human drivers will probably bully self-driving cars
Bullying has recently become a hot topic of conversation, particularly in schools. But of course, it's been an issue for decades, if not millennia, probably because it's human nature to pick on others who are seen as weak or different.
Sadly, a new study suggests that we may not limit our bullying to other humans. In the dawning age of artificial intelligence, like the kind found in self-driving cars, we may begin to bully software.
The study was conducted by Goodyear and the London School of Economics. Researchers polled roughly 12,000 respondents in 11 countries, asking them how they felt about autonomous vehicles. Similar questions were posed to participants in study groups held throughout Europe.
The data is in line with that derived from other studies, which have shown that motorists are very concerned about self-driving vehicles. Those surveyed admitted that human error is to blame for most auto accidents, and that, in theory, autonomous cars would be safer. However, nearly three-quarters of respondents (73 percent, to be exact) were afraid that self-driving software could malfunction, leading to Terminator/I, Robot/2001-style catastrophes.
Just as interesting is the fact that 60 percent of those surveyed worried that "Machines don’t have the common sense needed to interact with human drivers." They'll stop at every yellow light and they'll wait their turn at every intersection, which, in case you haven't noticed, isn't the way that most of us drive.
At least some respondents suggested that they'd find autonomous cars so frustratingly lawful that they might bully them in traffic. As one UK participant explained: "[The autonomous vehicles are] going to stop. So you’re going to mug them right off. They’re going to stop and you’re just going to nip round."
In the end, researchers found that respondents viewed autonomous vehicles more favorably the longer they thought about them. However, it'll take more than reflection and statistics to put drivers at ease. As the study organizers explain:
"Arguments that focus simply on promoting greater safety, lifestyle enhancements or economic efficiencies will not gain traction if [autonomous vehicles] do not fit comfortably into the public’s picture of what the road should be like for them to drive on."
Does this mean that programmers should design autonomous cars to drive like jerks? (Please, no.) Or should we modify our own driving behavior to be more lawful and patient? (Not very likely.)
This story originally appeared on The Car Connection.