Tesla reins in Autopilot after 'fairly crazy' drivers misuse self-driving tech

Tesla chief Elon Musk says the company will restrict the Autopilot feature in its Model S electric car so that drivers use it more responsibly. Even so, the company says Autopilot has already averted many accidents and caused none.

Tesla CEO Elon Musk says the Autopilot self-driving software has averted many accidents already, but that people are doing "fairly crazy" things with it. Here, a Tesla electric car is seen at a showroom in Cincinnati, Ohio, on November 3, 2015.

Al Behrman/AP

November 4, 2015

Self-driving cars are here. They’re not just being developed in labs or tested on closed courses – partially autonomous vehicles are cruising down public highways right at this moment, carrying people to and from work. And though the shift from cars driven by humans to cars driving humans has barely begun, it’s gone pretty smoothly so far. 

During a quarterly financial call on Wednesday, Tesla chief executive officer Elon Musk said his company’s introduction of self-driving features to its Model S electric car has been mostly successful. The self-driving mode, known as Autopilot, was made available to Model S owners via an over-the-air software update last month, and is currently active in about 40,000 vehicles.

“We're very aware of many accidents that were prevented by Autopilot, and not aware of any that were caused by Autopilot,” Mr. Musk told reporters on the call. (A video of one of those prevented accidents went viral last week: Hundreds of thousands of people watched dash cam footage of a Tesla automatically braking when a car veered into its lane near Seattle.)

Still, Musk said, the company is aware of some “fairly crazy videos” of people using Autopilot irresponsibly, such as activating it on twisty roads that the software isn’t yet ready to handle.

“We will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things,” Musk said, according to CarThrottle. That likely means Autopilot won’t activate unless the driver keeps at least one hand lightly on the steering wheel. Tesla advises drivers to keep their hands on the wheel at all times, but the Autopilot software doesn’t currently enforce that advice.

Musk conceded that Autopilot isn’t perfect right now, but said that it will get better over time.

“It was described as a beta release. The system will learn over time and get better and that’s exactly what it’s doing,” he said on the call.  “It will start to feel quite refined within a couple of months.”

Most people are used to the open-beta model for software, in which a product ships in a not-quite-perfect state and is updated over time based on user feedback. (Gmail was famously left in “beta” stage for more than five years as developers refined the software based on how people were using it.) But this iterative approach has only recently been applied to autos, and Tesla and other automakers that offer partially autonomous features for their cars have to make sure the software is safe, even in its early stages.

So far, accidents involving self-driving cars have mostly been the fault of human drivers, a record that suggests motorists can be reasonably confident about letting their cars take the wheel.