On a recent Saturday evening in South Carolina, Ward Mundy had his hands firmly on the steering wheel of his Tesla Model S when his wife let out a shout. The car was zig-zagging wildly across the road.
“It’s about the most dangerous ride you can imagine when you turn on Autopilot,” says Mundy, a lawyer and car enthusiast. He managed to get the Model S back under control, but it wasn’t the only time he’s had to rely on quick reflexes to avoid an accident. “You can be sailing along at 50 mph and the radar spots [an approaching] bridge and immediately slams on the brakes.”
Sometimes, the car doesn’t react when it should. “The other extreme is that you approach a stoplight with a car already stopped, and the Tesla doesn’t apply the brakes at all,” says Mundy.
Autopilot is the catch-all term for Tesla’s self-driving technologies. Its suite of hardware and software aims to match a car’s speed to traffic conditions, keep a car within its lane or automatically change lanes (a feature called Autosteer), park a car in a nearby spot, or allow it to be summoned from a garage. Elon Musk has said that by the end of this year, an Autopilot-enabled Tesla will drive from Los Angeles to New York without a human needing to touch the wheel at all.
To Mundy, that day seems a long way off. “It’s really a pretty scary experience,” he says. “If you’d ridden in the car with my wife, you would know how many times she’s screamed to turn it off.”
The Mundys’ recent Autopilot experiences are echoed by other Tesla owners in online forums and in YouTube videos of veering cars and near misses.

Tesla has been building new Autopilot hardware and software into every car that’s rolled off the production line since November, rapidly rolling out self-driving capabilities before they’re fully tested. Even though many Tesla drivers seem willing to play along, the company’s strategy has some of them worried.
The stakes are high for Tesla as it gambles on this aggressive approach to testing. Owners who want to activate their cars’ Autopilot feature have to pay thousands of dollars extra; if they opt not to, Tesla loses a chance to recoup its development costs. But with a growing record of unexpected swerves, fishtails, and other miscalculations, Tesla is risking not only a hit to its largely sterling reputation, but also the lives and safety of some of its biggest fans.
Things looked a lot rosier last March, when Elon Musk got up on stage at Tesla’s Design Center in Hawthorne, California, and promised a car with full self-driving technology for just $35,000 (although activating it would cost extra). Tesla’s Model 3 would be the company’s great leap forward: a stylish, speedy electric vehicle smart enough to let you read a book during your commute, or to earn you a few dollars by hiring out your robotic chauffeur for ride-sharing.
The company’s semi-autonomous Autopilot, which relies on radars, cameras, and ultrasonic sensors, had been working well. And Musk’s decision not to use lidar, the powerful but expensive laser-ranging system found in most self-driving rivals, seemed to be paying off.
First introduced in October 2015, Autopilot was an instant hit with owners, who loved its ability to drive on highways with little human control. They were so enthusiastic, in fact, that Musk quickly promised to rein in Autopilot’s hands-free operation to “minimize people doing crazy things with it.”
Sadly, the restrictions did not work for Joshua Brown. In May 2016, the 40-year-old Model S owner activated Autopilot while driving on a highway in rural Florida. The car failed to spot a truck making a left turn in front of him and drove right under it, killing Brown outright. A reconstruction by the National Highway Traffic Safety Administration (NHTSA) indicates that the truck should have been visible to Brown for at least seven seconds before impact, suggesting he was paying no attention to the road ahead.
Tesla took a beating in the press, but the NHTSA did not blame the company for the crash. Its investigation noted that Autopilot was not designed to cope with crossing-path collisions and that it “requires the continual and full attention of the driver.”
But the fatality did have consequences for Tesla. Mobileye, the Israeli company that supplied the original camera and software for Autopilot, cited safety concerns when it pulled out of its partnership with Tesla. Mobileye’s chief technology officer told Reuters that Tesla was “pushing the envelope in terms of safety…[Autopilot] is not designed to cover all possible crash situations in a safe manner.” Tesla says the collaboration ended for commercial reasons.
In October, Musk unveiled Mobileye’s replacement — an all-new Autopilot with a more powerful computer, forward-facing radar, eight cameras, a dozen ultrasonic sensors, and software developed in-house. The hardware would ship on all new Teslas from November, and the software would use fleet learning to quickly gain self-driving skills. In a feature Tesla calls “shadow mode,” every new car constantly feeds sensor data back to Tesla over a built-in wireless link, even if the owner has not paid for Autopilot.
“Before activating the features…we will further calibrate the system using millions of miles of real-world driving to ensure significant improvements to safety and convenience,” wrote Musk.
When the hardware was finally turned on in January, only certain Autopilot features were working, and only at certain speeds. With Autopilot running, even more data finds its way to Tesla’s cloud, helping engineers spot and eliminate bugs. But even taking its limited form into account, some owners judge the new Autopilot significantly less capable than the Mobileye system.
“The very first time I started Autosteer, [the car] did fish back and forth looking for lanes. It was a little scary,” says Steve Hearn, the owner of a Tesla Model S in Tampa, Florida. “Since then, it handled curves and construction areas well, but every time I was in a right lane and came to an exit, it wanted to take it. I had to grab it and pull it back into the traffic lane.”
Some of the system’s quirks are common enough that they have earned their own nicknames. “Truck lust” describes the cars’ tendency to shimmy up alongside 18-wheelers on highways, while “lane dancing” is a wiggle in the Tesla’s dashboard display as it hunts for the right position on the road.
“This happens all the time,” says John Larson, owner of a manufacturing firm near Chicago. “I was sitting at a stop light on Lincoln Avenue the other day [in my Model S]. Lincoln is straight as an arrow for four miles and I’m not moving. So why are the lane lines showing a curve to the left and then straight and then rolling lines? It makes no sense.”
Tesla calls today’s incarnation of Autopilot a “beta phase” and admits that its performance may vary during the initial rollout. It requires drivers to approve a lengthy disclaimer that severely limits the circumstances in which Autopilot should be used and recommends “a higher level of vigilance,” including keeping hands on the wheel at all times. The privilege of acting as a beta tester currently costs between $5,000 and $10,000, depending on which Autopilot features drivers activate and when.
As Tesla has deployed updates wirelessly, some things have improved. Drivers can now use Autosteer on more roads and at higher speeds, and the Traffic Aware Cruise Control has become more reliable. “When you’re rolling along and it puts the brakes on for no reason, that does seem like it’s been happening less,” says Larson. “But it still tries to take exits on highways. And to go through curves on local roads, I have to turn the wheel so hard it disengages Autosteer. It’s like the car is trying to get into an accident.”
It’s an open question how many of the system’s glitches stem from insufficient testing, versus more entrenched flaws in its underlying design. “Having a wealth of data is incredibly powerful but the [software] is also massively important and very difficult,” says Karl Iagnemma, CEO of NuTonomy, a provider of AI systems for self-driving cars. “The algorithmic element is often something that can’t be sped up simply by having access to more data — it’s a process of painstaking development.”
In response to questions about the specific flaws owners identified, Tesla says that it has a unique and robust validation process, and that it monitors customer feedback very closely. Musk said last week that the new Autopilot will reach the speeds of the previous system sometime in March, “and things will only improve from there.” But that still leaves Tesla a mountain to climb to reach a fully self-driving system that can cope with rare but dangerous situations like the one that killed Joshua Brown.
Even if Tesla eliminates all its bugs, Autopilot is what the Society of Automotive Engineers (SAE) calls a Level 2 system — able to handle steering, acceleration, and braking, but relying on an alert human driver to judge when it is safe to use, and to take over at any time. The leap from there to a Level 4 system that works in virtually all situations and can get itself out of trouble — the kind Tesla will need for a cross-country road trip — is vast.
Tesla has released a video showing a vehicle apparently driving itself, but would not say how many miles such cars have covered. The only available data suggests that the company’s program is still in its early stages. California requires companies testing autonomous vehicles on the state’s roads to report their annual mileage, along with any problems the cars have.
Tesla reported zero autonomous miles in 2015, and only 550 miles in 2016 — during which a safety driver had to take control of the car 182 times. (Alphabet’s self-driving spinoff Waymo, for comparison, declared over 635,000 fully autonomous miles in California last year, with just 124 hand-offs to safety drivers.)
When I asked the Tesla owners whether they still use Autopilot, all said yes. I expected Larson’s response: Although the system is not perfect, he says it helps him with his commute in stop-and-go traffic. But Hearn and Mundy surprised me. “I use the cruise control as much as possible, with the thought that it’s beaming data back to the mothership and is going to help get things going much quicker,” says Hearn.
His wife’s horror notwithstanding, Mundy agrees. “I’m actively using Autopilot because all that stuff gets fed back to Tesla,” he says. “I’m just holding my breath hoping that we all get through this alive.”