Just over a year ago, Tesla sent out a software update to its cars that made its “Autopilot” features available to customers, in what the company called a “public beta test.” In the intervening 12 months, at least one customer died while his Tesla was operating in autopilot mode. Cars have crashed, regulators have cracked down, and the headlines proclaiming that “Self-Driving Cars Are Here” were replaced with Tesla’s assurances that autopilot was nothing but a particularly advanced driver-assist system.
Given all this, one might assume that a chastened Tesla would take a more cautious approach with its next iteration of autonomous technology. But at a launch event this week, Tesla introduced its Autopilot 2.0 hardware with the promise that all the cars it builds from now on will have hardware capable of “the highest levels of autonomy.”
Tesla’s proof that its new hardware is capable of driving in a “complex urban environment” was a brief, edited video of the system navigating the area around its headquarters near Stanford University in California. Though exciting for enthusiasts who can’t wait to own a self-driving car, the video is hardly proof that Tesla’s system is ready to handle all the complexities that are holding back other companies that have been working on autonomous technology for longer than it has. As impressive as Tesla’s system is -- and make no mistake, it is deeply impressive -- navigating the Stanford campus is a hurdle that even graduate-school projects are able to clear.
Tesla’s new sensor suite upgrades what was a single forward-facing camera to eight cameras giving a 360-degree view around the car. It also updates the 12 ultrasonic sensors, while keeping a single forward-facing radar. Yet independent experts and representatives from competitor firms tell me this system is still insufficient for full Level 5 autonomy -- the National Highway Traffic Safety Administration’s highest rating -- which they say requires more (and better) radar, multiple cameras with different apertures at each position, and 360-degree laser-sensing capabilities.
What Tesla’s upgraded hardware does do is vastly improve the company’s ability to pull high-quality data from its vehicles already on the road, giving it an unrivaled ability to comply with new regulatory guidelines requiring granular data about autonomous-drive functions in a variety of conditions.
Whereas its competitors’ autonomous-drive programs harvest data from small test fleets and extrapolate from there, Tesla has made every car it sells into an independent experiment, exposed to conditions that can be found only on the open road. All this real-world data gives Tesla a unique opportunity to validate its autopilot technology. If the company had announced Autopilot 2.0 as another step toward an eventual fully autonomous system, this would be an unambiguously good (if not earth-shattering) development.
Unfortunately, that’s not what Tesla did. Instead, at Wednesday’s launch event, it called its new hardware suite “full self-driving hardware.” It said it would demonstrate the system’s ability to drive cross-country without any human intervention. Tesla even hinted that a feature will allow its cars to be rented out as autonomous taxis when not in use by their owners.
Though Tesla’s website noted that many of these features will need validation and regulatory approval, this caveat was lost in the hype. As with Autopilot 1.0, Tesla is again inviting a mismatch between owner/operator expectations and its systems’ true capabilities without any apparent recognition that this gap -- not technical failures of the system itself -- is the key point of concern for regulators and critics.
Tesla CEO Elon Musk demonstrated his own inability to understand this important distinction when he launched a pre-emptive attack on the press: “In writing some article that’s negative, you effectively dissuade people from using an autonomous vehicle, you’re killing people.”
Musk is correct that human drivers are extremely dangerous and that autonomous-drive technology has the potential to save lives. But media criticisms of autopilot have centered on the danger of assuming such a system is more capable than it really is. When an Ohio man named Joshua Brown died in a Tesla operating in autopilot mode earlier this year, a DVD player was found in the wreckage, which many took as an indication that he believed the system was capable enough to properly self-drive when in fact it wasn’t.
By pointing out this gap between perception and reality, the media reaction to Brown’s death may have saved lives that Tesla’s overhyping of its autopilot’s capabilities had endangered. Unfortunately, by casting healthy skepticism as potentially lethal Luddism, Musk is extending his pattern of making bad-faith arguments about autopilot’s safety.
This seems an odd moment for Tesla to again invite such misperceptions. Chinese regulators have forced it to remove the word for “self-driving” from its website there; German regulators are sending letters to owners warning that autopilot is not autonomous; and California officials are considering banning the use of “autopilot” and other confusing terms used to market semi-autonomous systems.
The simplest explanation is that in the race toward true autonomous-drive capability, Tesla boosts its brand prestige by suggesting that its current cars are already there. But it’s also worth noting that over the last quarter the company slashed prices on its cars to meet its full-year delivery expectations. What better way to boost sales than to make every car Tesla has already sold instantly obsolete in the eyes of consumers champing at the bit for full autonomy?
We should celebrate that Tesla has taken another important step toward eventual autonomy. But until the company’s public statements and sales strategy are dialed back to reflect only what each update is actually capable of, Tesla -- not press critics or regulators -- represents the greatest threat to the lifesaving promise of self-driving technology.
By Edward Niedermeyer
Edward Niedermeyer, an auto-industry analyst, is the co-founder of Daily Kanban and the former editor of the blog The Truth About Cars. –Ed
Bloomberg