Could updates fix Tesla's Autopilot woes?

Tesla head Elon Musk says that an over-the-air software update could lead to 'significant improvements.'

The radar technology of a Tesla Model S containing Autopilot features is pointed out during a Tesla event in Palo Alto, Calif., in October. Tesla chief executive Elon Musk said on Twitter Sunday that a planned over-the-air software update could provide "significant improvements" to the feature, which has been faulted in one fatal and two other accidents involving Tesla's cars.

Beck Diefenbach/Reuters/File

July 18, 2016

As a controversy around Tesla’s Autopilot feature continues to ramp up, chief executive Elon Musk is attempting to reassure owners that the partially autonomous system could receive “significant improvements.”

After what he called a “promising” call with Bosch, the German auto parts company that makes the radar system Tesla’s cars use to detect obstacles, Mr. Musk said on Twitter that a new over-the-air software update could be coming soon.

While a slew of carmakers and tech companies have increasingly been focusing on developing self-driving vehicles, Tesla’s woes come at an early, critical time for the technology.


Questions about Autopilot have mounted in the wake of a fatal crash that killed a driver in Florida using the feature in May, prompting an investigation by federal regulators. Two other crashes have also been reported, with the driver in one nonfatal crash in Montana saying he did not receive any warnings that he needed to resume manual control of the car.

“I don't know the details behind the Tesla strategy, but they are definitely attempting to do a lot while simultaneously balancing safety and introducing these new features,” says Brandon Schoettle, a project manager at the University of Michigan’s Transportation Research Institute, in an email to The Christian Science Monitor.

“We feel that some formal testing should be done by all manufacturers before they become available to the general public,” he adds.

In a New York Times column earlier this month, Mr. Schoettle and Michael Sivak, a research professor at Michigan’s Transportation Research Institute, argued that self-driving cars should face licensing tests similar to those required of human drivers.

But, Schoettle tells the Monitor, it’s unclear what impact an over-the-air update will have on the Autopilot feature. “If a software patch will fix the problem, then over-the-air updates would be a big help in getting fixes out immediately.  But if an upgrade requires different hardware or a combination of software and hardware updates, then over the air will not work in those situations,” he writes.


Last fall, in the wake of videos showing some people using the Autopilot feature irresponsibly, including on twisty roads, Musk announced an over-the-air update that would put some “additional constraints” on when the feature could be used.

Now, he appears less receptive to criticism of the feature, including calls from Consumer Reports magazine to disable it by default. Soon after discussing the software update, he tweeted about a recent poll – the source was unclear – suggesting that “0.0 percent” of Tesla owners want the feature disabled.

In 2014, one person died for every 100 million miles driven in the US, according to the National Highway Traffic Safety Administration (NHTSA). The accident on May 7 was the first fatality in 130 million miles of driving on Autopilot, says Tesla. "So your odds of remaining safe on the road are actually better in an autonomous vehicle than with a human behind the wheel," writes Vanity Fair's Nick Bilton.

But one consumer group argues that Musk’s stances are particularly concerning because Tesla has previously touted its focus on safety.

“You want to have it both ways with autopilot. On the one hand, you extoll the virtues of autopilot, creating the impression that, once engaged, it is self-sufficient,” officials from the group Consumer Watchdog wrote in a letter to Musk earlier this month. “On the other hand, you walk back any promise of safety, saying autopilot is still in Beta mode and drivers must pay attention all the time.”

But Musk seemed to rebuff those concerns in a series of Twitter posts, referencing a post by an anonymous user on a Tesla blog that questioned whether disabling the feature by default and halting public beta releases of the technology was necessary.

“Do you know of any automotive lab that emulates every single road condition? Is it even possible to create one??” the user wrote. “Google has been trying to collect real life data from its own Level 3 [partially autonomous] cars on public roads. However, that approach has been a slow process, does not collect sufficient data and delays the significant advantages of autonomous driving.”

Carmakers have often grumbled that further regulation could harm the progress of self-driving technology, particularly if each state creates its own rules.

But Schoettle, the Michigan researcher, says further guidelines from NHTSA would be a natural fit to regulate self-driving cars because the agency sets existing safety standards. The regulator is also investigating the fatal Tesla crash in Florida.

“In keeping with that tradition, regulation for self-driving vehicles should dictate the performance requirements of a particular system, and allow the manufacturers to determine the best way to accomplish that level of performance,” he tells the Monitor.

Sen. John Thune (R) of South Dakota, who heads the Senate Committee on Commerce, Science, and Transportation, has asked the automaker to brief the committee by July 29 on the Florida accident and Tesla's response to it, according to Reuters.