
Tesla is taking a calculated risk by using real customers as beta testers for its still-developing Autopilot software, Bloomberg Businessweek's Zachary Mider writes in this week's cover story.
Why it matters: There's a debate over whether it's better to wait until self-driving technology is perfected before deploying it, or to put incomplete software on the road now, where it can save lives as it improves.
- "It's possible that both sides are right, that the computers are killing a few drivers who otherwise would have lived, but that they’re also saving the lives of many more," writes Mider.
Driving the news: Autopilot, Tesla's assisted-driving software, appears to have played a role in 4 of 5 known fatalities since it was introduced in 2015, Mider writes.
- Among them was Florida's Jeremy Banner, whose sedan failed to spot a tractor-trailer crossing the 4-lane highway ahead of him. His Tesla hit the truck broadside, and he died instantly.
- His family is suing Tesla for making a defective car.
Yes, but: Driving killed 40,000 Americans last year and 1.4 million people globally, per Bloomberg Businessweek.
- Tesla CEO Elon Musk has claimed that driving with Autopilot is about twice as safe as driving without it, but there's no published data to prove that assertion, and Tesla's quarterly safety reports are inconclusive.
- He once said it would be "morally reprehensible" to keep Autopilot off the market.
The latest: In a report out today, Consumer Reports said it tested Tesla's new Smart Summon parking feature and found it glitchy, with cars sometimes driving "erratically, like a drunken or distracted driver."
- Noting that Tesla customers paid $6,000 upfront for self-driving features that are not complete, Jake Fisher, CR's senior director of auto testing, says: "What consumers are really getting is the chance to participate in a kind of science experiment. This is a work in progress."