The bill finally comes due for Elon Musk
Tesla CEO Elon Musk is finally forced to stop hiding behind his bluster and show off a real autonomous vehicle. Anything less would be a disaster.
For almost as long as he's been CEO of Tesla, Elon Musk has been bullshitting us about self-driving cars.
In 2016, he said Tesla self-driving cars were "two years away." A year later, it was "six months, definitely," and customers would be able to actually sleep in their Tesla in "two years." In 2018, it was still a "year away" and would be "200 percent safer" than human driving. In 2019, he said there would be "feature complete full self-driving this year." Not a year has gone by without Musk promising the imminent arrival of a fully driverless Tesla.
This week, it's finally here. Or at least that's what Musk says.
On October 10th, Tesla will reveal its long-awaited "robotaxi," a supposedly fully autonomous vehicle that Musk has said will catapult the company into trillion-dollar status. It will be some combination of "Uber and Airbnb," Musk said during a recent earnings call, allowing Tesla owners to serve as landlords for their driverless cars as they roam about the cityscape, picking up and dropping off strangers. And it will be futuristic in its design, with Bloomberg reporting that it will be a two-seater with butterfly-wing doors. Musk has been calling it the "Cybercab."
The event, which will be held on the film lot of Warner Bros. in Burbank, California, will be the culmination of almost a decade of blown deadlines and broken promises from Musk, a moment when the richest man in the world will finally be forced to stop hiding behind his own bluster and actually show us what he's been working on.
It's a vulnerable time for Tesla. The company's sales slumped in the first half of the year, as rising competition in the US and China dimmed Tesla's star. Musk is fighting to reclaim his enormous $56 billion pay package, all while spreading misinformation on his social media platform and stumping for former President Donald Trump. And now there's this product event, Tesla's first since the unveiling of the Cybertruck in 2019.
Almost a decade of blown deadlines and broken promises
Based on past Tesla events, don't expect Musk to follow through on all his promises.
It seems likely that we'll see a cool demo of a stylish-looking prototype, allowing Musk to claim a kind of victory for first impressions, even if the rough outlines of what he promises barely hold up to scrutiny. The exaltations from bullish investors will give him enough cover to continue making misleading declarations about what is and isn't autonomous. And the safety experts and competitors who try to warn about the dangers of his approach will likely be drowned out or dismissed by his most ardent fans.
But either it works or it doesn't. Waymo and others have already shown the world what real driverless technology looks like. It's imperfect and it's limited, but it's undeniable. If Musk fails to deliver, or shows off some obvious vaporware, his reputation and Tesla's stock price could take a real hit.
"He's really grasping for straws," said Mary "Missy" Cummings, a robotics expert and former senior safety official at the National Highway Traffic Safety Administration. "He's so desperate to try to drive more money into this equation that he's doing things like this [event]."
"The hardware needed"
I first started covering Tesla for The Verge in 2016, the same year that Musk made one of his first predictions about the imminent arrival of self-driving cars. "You'll be able to summon your car from across the country," he said, citing as an example a Tesla owner beckoning their vehicle to drive solo from New York to meet them in Los Angeles. The company went even further in a blog post, boasting that "all Tesla vehicles produced in our factory, including Model 3, will have the hardware needed for full self-driving capability at a safety level substantially greater than a human driver."
That post has since been deleted from Tesla's site, along with the company's first "Master Plan," as Musk attempts to scrub Tesla's past of all his overreaching pronouncements.
"Substantially greater than a human driver"
But more importantly, these kinds of statements fooled a lot of people into thinking the shiny new electric car in their driveway would have everything it needed to be fully autonomous, and that those futuristic capabilities were just around the corner. Elon Musk would flip the switch and, presto, millions of cars would suddenly transform into robots. The media bought into it, portraying Tesla as being on the cusp of a historic evolution. And soon enough, the company's stock started reflecting this attitude, especially after Tesla defied expectations with the Model 3.
Of course, none of it was true. Nearly a decade later, no Tesla vehicle on the road today is autonomous. Sure, the company has rolled out a series of brashly branded driver-assist features: first Autopilot, then Navigate on Autopilot, then Full Self-Driving, and finally Full Self-Driving (Supervised). But none of them enable the car to drive without constant human supervision.
You can't sleep in your Tesla. You can't summon it across town, let alone across the country. If you crash, you will be liable for what happens and who gets hurt. And if you attempt to fight the company on any of that, you will probably lose.
You can't sleep in your Tesla
Even those Tesla owners lured into thinking their vehicles were incognito robots would soon realize the cost of the company's obfuscations. In 2021, Tesla first started offering subscriptions to its long-awaited Full Self-Driving feature, including a $1,500 hardware upgrade for those early owners who were wrongly informed that their vehicle would have "the hardware needed" for full autonomy. (It was later lowered to $1,000 after customer outcry.)
There are plenty of people using Full Self-Driving (Supervised) today who will happily tell you how great it is and how they can't imagine life without it. (Many also have YouTube channels they want to promote.) They will also argue over the semantics of autonomy. Shouldn't something that controls the acceleration, braking, steering, and navigation also get to be called autonomous?
In the absence of data from Tesla, it's impossible to say how good or terrible FSD is with any certainty. Crowd-sourced projects like FSD Community Tracker are extremely limited, featuring data on a scant 200,000 miles of driving; Tesla says over 1 billion miles have been driven using FSD. But even the tracker's tiny snapshot of data shows 119 miles between critical disengagements. Waymo drove 17,000 miles between disengagements in 2023, according to the California DMV, more than a hundred times as far.
While Tesla chased a much broader vision, Waymo leapt forward by settling on something more workable: remove the driver entirely and restrict the geography in which the vehicle can operate. Google, from which Waymo spun out in 2016, has long argued that advanced driver-assistance systems like Autopilot and FSD are inherently problematic. After all, human supervisors get bored and eventually zone out. The handoff between the vehicle and the driver can be fraught. It's better to just cut the human out of the equation altogether.
Tesla is now latching onto Waymo's better vision in unveiling a fully autonomous vehicle, the robotaxi. This is the vehicle that can silence all of those doubters. After all, Waymo doesn't sell cars; it sells a service. Tesla sells cars. And wouldn't it be infinitely cooler to own your own self-driving vehicle?
"You're killing people"
Tesla likes to say that Autopilot, and later FSD, is saving lives. In fact, Musk has gone even further, declaring that any criticism of its driver-assistance products amounts to murder. "You need to think carefully about this," he said in 2016, "because if, in writing some article that's negative, you effectively dissuade people from using an autonomous vehicle, you're killing people."
At the same time, he said that Tesla had no plans to assume legal liability for crashes or deaths that occurred when Autopilot was in use unless it was "something endemic to our design."
Even in the annals of Musk quotes that have aged poorly, these rank up there. At the time, only one person had died while using Autopilot, a reflection, perhaps, of the small number of Tesla vehicles on the road. Now, there are over 2 million Teslas all over the globe and a substantially higher number of deaths.
"Something endemic to our design"
Federal regulators are currently investigating at least 1,000 individual Tesla crashes involving Autopilot and FSD; at least 44 people died in those crashes. Investigators found that Autopilot (and, in some cases, FSD) was not designed to keep the driver engaged in the task of driving. Drivers would become overly complacent and lose focus. And when it came time to react, it was too late.
Tesla has pushed out numerous updates to FSD over the years, so it can be tough to pin down what exactly is wrong with Tesla's approach. Often, users flag a problem (the vehicle fails to recognize certain signage or flubs a specific driving maneuver), and almost as quickly, Tesla has an update available. That seems like a good thing, a company responsive to problems and quick to fix them, until you remember that real people's lives are at stake. And the pedestrians and cyclists outside the vehicle never consented to participating in this experiment to teach cars to drive themselves.
Even the most recent version of the FSD software has its faults. An independent research firm recently tested versions 12.5.1 and 12.5.3 over more than 1,000 miles and found the software "surprisingly capable, while simultaneously problematic (and occasionally dangerously inept)." When errors occur, "they are occasionally sudden, dramatic, and dangerous." In one instance, the group's Tesla Model 3 ran a red light in the city at night, even though its cameras clearly detected the lights.
FSD is the foundation for the robotaxi. Everything has been leading up to this moment. But the system struggles with basic perception issues, like wet roads and sunlight glare. It also struggles to recognize motorcyclists: a 28-year-old motorcyclist was killed outside of Seattle earlier this year by a Model S driver who was using the driver-assist feature.
The system struggles with basic perception issues
Tesla used to publish quarterly safety reports that it claimed proved Autopilot was safer than regular human driving, but it stopped suddenly in 2022. It started up again this year with a new report that says there is only one crash for every 6.88 million miles of Autopilot-assisted driving, versus one for every 1.45 million miles of non-Autopilot driving. That's over four times safer than normal human driving, according to Tesla.
This is the only safety data we have for the driver-assist technology that is supposed to be a precursor to Tesla's fully autonomous robotaxi. But according to Noah Goodall, a civil engineer who has published several peer-reviewed studies of Tesla Autopilot, the company's safety reports fail to account for basic facts about traffic statistics, such as that crashes are more common on city streets and undivided roads than on the highway, where Autopilot is most often used. That led him to conclude that Tesla may be miscounting crashes in order to make Autopilot seem safer than it actually is.
"They fell apart pretty quickly, once you dove in just a little bit," Goodall told me. "I have trouble publishing on this sometimes. Just because the reviewers are like, 'Everyone knows these are fake, why are you pointing this out?'"
"A monumental effort"
If there's one thing on which everyone can agree, it's that Tesla has a lot of data. With nearly 5 million drivers on the road globally, Tesla has each vehicle sending huge amounts of information back to the mothership for processing and labeling. Other companies, with only a fraction of the real-world miles, have to use simulated driving to fill in the gaps.
But the sheer volume of data that Tesla is processing is overwhelming. The company relies on a small army of data annotators who review thousands of hours of footage from Tesla owners and the company's in-house test drivers. And according to Business Insider, those workers are pushed to move quickly through as many images and videos as they can or face disciplinary action. Accuracy is secondary to speed.
"It's a monumental effort," Cummings, the robotics expert, said. "People think Teslas are learning on the fly. They have no idea how wrong they are, and just how much human preparation it takes to actually learn anything from the terabytes of data that are being gathered."
Tesla's approach to the hardware of driverless vehicles also diverges from the rest of the industry. Musk famously insists on a camera-only approach, in contrast to the widely used practice of relying on a "fusion" of different sensors, including radar, ultrasonic, and lidar, to power autonomous driving. Musk calls lidar, in particular, a "crutch" and claims any company that relies on the laser sensor is "doomed." Waymo's robotaxis are adorned with large, obvious sensors, a style expressly at odds with the sleekness of Musk's vehicles.
Of course, Tesla does use lidar on its test vehicles, but only to validate FSD. The sensors won't be going on any customer cars, since lidar is still too expensive. With tens of thousands of laser points projected every second, lidar provides a critical layer of redundancy for the vehicle, as well as a way to visualize the world in three dimensions.
The idea that you can introduce a fully autonomous vehicle without the full suite of sensors that power every other AV on Earth strains credulity for most experts on the technology.
"Why on earth would you want to tie one hand behind your back when you're solving an almost impossible problem?" said Phil Koopman, an AV expert at Carnegie Mellon University. "And we know it's going to be big bucks, so don't skimp on the hardware."
High five
What is an autonomous car? It sounds like a simple question, but the answer is trickier than it seems. To help clear things up, SAE International, a US organization that represents automotive engineers, created a six-level guide to automation. Intended for engineers rather than the general public, it ranges from Level 0, meaning no automation whatsoever, to Level 5, meaning the vehicle can drive itself anywhere at any time without any human intervention.
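In between: Levels 1 and 2 are driver-assistance systems that require constant human supervision, which is where Autopilot and FSD sit; Level 3 lets the driver look away under limited conditions; and Level 4 is fully driverless, but only within a defined area and set of conditions, which is how Waymo operates today.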
And there's plenty of room for error and misunderstanding. A problem we've seen is what researcher Liza Dixon calls "autonowashing": any effort to overhype something as autonomous when it's not.
Most experts dismiss Level 5 as pure science fiction. Waymo and others operate Level 4 vehicles, but very few people in the industry believe Level 5 is attainable. It would require "an astronomical amount of technological development, maintenance, and testing," says Torc Robotics, a company developing self-driving trucks. Others call it a pipe dream.
Except Musk. At a conference in Shanghai, Musk said with supreme confidence that the company "will have the basic functionality for Level 5 autonomy complete this year." That was in July 2020.
He'll likely try to pass off the Tesla robotaxi as the fulfillment of that promise, the vehicle that will make this wildly implausible goal a reality. It's important to see through the bluster and bullshit, and to measure what he shows against what he's promised in the past and what other players have already achieved.
Tesla's history is littered with fanciful ideas that never panned out: a solar-powered Supercharger network, battery swapping, robotic snake-style chargers. But Musk never bet his entire company, his reputation, and, most importantly, his net worth on those projects. This one is different. And soon enough, we'll know whether the Tesla robotaxi is the exception to the rule or just another guy dancing in a robot costume.