Elon Musk’s Full Self-Driving Promises: Unrealistic, Dangerous

Guinea pigs are lovely creatures, but you probably don’t want to be one to fatten Tesla’s bottom line.

The possibilities of self-driving cars are arriving more quickly now than at any other time in the history of the automobile. However, every reputable organization in the field acknowledges that the technology, while progressing rapidly, is not nearly ready to be deployed in every environment with the levels of safety necessary for public acceptance. Tesla, in its insatiable desire to be first, has repeatedly pushed the envelope by making autonomous technology ever more available to the public. The consequences have not gone unnoticed; while most Tesla drivers have used their vehicles’ extended features responsibly, a significant minority has, naturally, begun testing the Autopilot functionality in ways Tesla technically discourages but tacitly encourages. Joshua Brown paid for this experimentation with his life. Elaine Herzberg, who had nothing to do with a Tesla, also died at the hands of autonomous technology (in an Uber-modified Volvo XC90, a vehicle from a company renowned for its commitment to safety). As a result, you’d hope to see more caution from the figurehead of one of the most innovative car companies on the planet than what you’re likely to find on Elon Musk’s Twitter feed these days:

Self-driving Teslas just months away, promises Musk

Tesla’s cars will in August suddenly activate “full self-driving features,” the company’s chief executive Elon Musk tweeted on Sunday, three days after federal investigators said a Tesla SUV driving semi-autonomously had accelerated to over 70 mph and smashed into a highway barrier.

Self-driving technology is on the way, but it won’t be safely here by the end of summer, no matter what Musk says.

There are a number of things wrong with such a statement, but let’s focus on the most obvious one: the technology still isn’t ready for full-time, hands-free, always-ready, mainstream deployment. This has been made obvious most directly (and most painfully, from the perspective of their families) through the deaths of individuals like Joshua Brown, who was decapitated in 2016 when, per reports, he was either sleeping, reading, or watching videos in his Model S as it drove under a semi trailer at full speed under the control of Autopilot. It was shown again through the 2018 death of 38-year-old Walter Huang, who died in March when his Autopilot-driven Model X accelerated and veered into a highway barrier before bursting into flames.

Huang, like Brown, had not been driving his Tesla when it took his life, strongly indicating that both men had placed their lives in the hands of Tesla’s technology. While Tesla has consistently stated that Autopilot is not a replacement for a human driver and that drivers are always required to maintain control over their vehicles, they released semi-autonomous technology to the market with full knowledge that people were going to behave like people, which means going “look Ma, no hands!” more often than not when given the chance. This is irresponsible, and I’m not the only person to point this out.

Tesla wants guinea pigs to get the bugs out of their technology; will you be one of them?

“Tesla has a history of using consumers as guinea pigs,” said David Friedman, the director of cars and product policy at Consumers Union, the advocacy arm of Consumer Reports. Tesla’s “misleading” marketing, he said, has had the dangerous effect of “providing overconfidence and building you up to thinking it’s safer than it actually is.”

Autonomous driving is like a wooden tower. If you build it too quickly, it’s going to come crashing down.

Friedman nails the key issue with Musk’s ill-advised but thoroughly intentional tweet. Musk is a marketer, one of the most successful in recent times, and the more hype he generates about Tesla, the higher the company’s stock rises, the more money the company raises, and the more people are willing to write thirty-, forty-, or one-hundred-thousand-dollar checks for the hope and promise of vehicles that drive themselves. The problem, of course, is that the vehicles, while impressive, aren’t ready to be trusted to fully drive themselves. As a result, anyone driving one and expecting it to work as a fully self-driven vehicle may pay for it someday with his or her life.

…and you don’t need to be in an autonomous vehicle to be at risk of being run over by one.

However, as Edward Thorp would teach us, there are negative externalities, or additional consequences, that come into play when someone decides to turn on an Autopilot-enabled (or addled) Tesla and turn off his or her brain. Beyond the driver, any other occupants in the vehicle are now risking their lives, whether they wish to or not. The occupants of every other vehicle on the road are also at risk, as one never knows whether the Tesla approaching in the opposing lane is about to veer into one’s path. Pedestrians and cyclists are at even greater risk, as they don’t even have the basic protection of a vehicle around their bodies. These are the risks; we all become Tesla’s guinea pigs when they release buggy software, encourage people to use it responsibly, and wink as people start behaving irresponsibly. Tesla collects telemetry data on every mile its vehicles are driven; they pay particular attention once those vehicles are involved in collisions, and extra attention whenever a collision is serious enough to cause a fatality or draw media attention, because that’s the signal to move into damage-control mode.

Upgrading your car’s brain isn’t as easy as upgrading your phone

Buyers of Tesla’s sedan or SUV, including the $140,000 Model X P100D, can pay an extra $5,000 for “Enhanced Autopilot,” a package of still-experimental features that the company says could include “on-ramp to off-ramp” autonomous freeway driving. Drivers can prepay another $3,000 on top of that for its “Full Self-Driving Capability” package, which the company advertises as “All you will need to do is get in and tell your car where to go.”

But Tesla has shared little about how it has tested these features, Friedman said, adding that treating self-driving capabilities as easy software updates could have deadly results.

We can’t approach self-driving technology like a slip-and-slide to the finish line.

If this doesn’t give you pause, I’m not sure what will. From the way Tesla treats these updates, it’s difficult to tell whether you’re adding features to a MacBook Pro or to a 4,000-pound vehicle capable of propelling itself at 70 mph without a driver’s input. Of course, you’re not supposed to use a Tesla that way yet. But per Musk, you soon will be able to, whether your Tesla is capable of doing so consistently or not. And thanks to Musk, everyone on the road will be at the mercy of any undiscovered bugs in the system. You could argue that this isn’t any different from the risks we already run on roadways dominated by human-steered vehicles, which still have a far, far worse safety record than Autopilot or any other mainstream autonomous technology. I’m not arguing that the technology isn’t safer than human driving, because it is. I’m arguing that Tesla is rushing to be first in a race we’re all going to win in the long run so they can get the glory in the short run. And that’s just not safe for any of us.

If you find my information on best practices in car and car seat safety helpful, you can buy my books here or do your shopping through this Amazon link. Canadians can shop here for Canadian purchases. Have a question or want to discuss best practices? Send me an email at carcrashdetective [at] gmail [dot] com.