‘Elon Musk’s Crash Course’: 3 key points from Tesla’s documentary


By now, everyone knows the two-pronged promise Tesla has made for almost a decade: the automaker aims to revolutionize both cars’ relationship with the environment (through gasoline-free electric power) and consumer safety on the roads (thanks to self-driving capabilities). Tesla CEO Elon Musk has long been fond of pointing out that road deaths would decrease if driving weren’t in the all-too-human hands of the driver – and has promised that one day traveling by car will be like taking an elevator: “You tell it where you want to go, and it will take you there with extreme levels of safety.”

The cars are certainly electric. That second goal, according to a new documentary, has proven more elusive.

Informed by the reporting of Cade Metz and Neal Boudette of The New York Times, director Emma Schwartz’s “Elon Musk’s Crash Course” raises a skeptical eyebrow at Tesla’s vaunted Autopilot feature, sometimes described as its self-driving software. The film argues that Autopilot failed to deliver and that lives were put at risk as a result. Here are three key arguments that Schwartz’s film puts forward.

1. Despite Tesla’s claims that its technology would revolutionize cars and make them safer, its cars sometimes failed to recognize certain safety threats while on Autopilot – and Tesla drivers died in road accidents while using it.

According to “Elon Musk’s Crash Course,” a 2016 investigation by the National Highway Traffic Safety Administration (NHTSA) found that some 38 Tesla crashes occurred in the United States while the cars were in Autopilot mode; the film details three in which drivers were killed.

The first is that of Josh Brown, a Navy bomb-disposal officer during the Iraq War and the founder of a company that aimed to extend Internet service to rural America. Described by friends as a tech enthusiast, Brown loved his Tesla and often shot videos while driving. When Musk retweeted one such video in April 2016, in which the car, on Autopilot, swerved out of the way of an aggressively merging truck, Brown was thrilled.

The following month, Brown was driving in the same mode through Williston, Fla., after leaving Disney World when his Tesla rolled under a tractor-trailer without slowing down. Brown, 40, was killed in the collision. (Despite rumors that Brown had been watching a movie, the documentary makes clear that no movie was found on his laptop. Still, NHTSA and the National Transportation Safety Board, or NTSB, found that Brown was at fault because he wasn’t paying attention to the road.) In the film, Musk is heard in an audio recording saying that the radar upgrades added to the Autopilot software after the accident could have saved Brown’s life.


In March 2018, 38-year-old Apple engineer Walter Huang died when his Tesla, operating in Autopilot mode, hit a concrete barrier in Mountain View, Calif., at more than 70 mph. Former NTSB chairman Robert L. Sumwalt says onscreen that Huang had been playing a video game at the time.

And in March 2019, Jeremy Banner, 50, was killed in another Florida crash almost identical to the one that killed Brown. His Tesla was on Autopilot when a tractor-trailer crossed the road ahead; the car failed to recognize the side of the trailer in direct sunlight and drove under it, shearing off the roof.

Sumwalt alleges in “Crash Course” that Tesla ignored the NTSB’s safety recommendations after the crashes. “When innovation is implemented, we have to make sure it’s done safely,” he says, “or it’s going to be the Wild West out there.”

2. Some former Tesla engineers had private doubts about Musk’s public promises about the cars’ ability to drive themselves.

Despite Musk’s claims, dating to 2015, that self-driving cars were essentially a “solved problem” with only the details left to work out, several former staffers say in “Crash Course” that this wasn’t the case behind closed doors.

They say, for example, that some decisions were made somewhat arbitrarily, such as the choice to rely on cameras instead of lidar, a laser-based sensor popular with competitors. “There was no extensive research phase where various vehicles were fitted with an array of sensors. Many team members would have liked that,” says Akshat Patel, an Autopilot engineering program manager from 2014 to 2015. “Instead, the conclusion was reached first, and testing and development activities began to prove that conclusion was correct.”

Others allege they feared Autopilot was being sold to, and used by, people who believed it would provide the same elevator-like ride experience that Musk once described — drivers who believed they could get in, enter a destination, then sit back and relax. When Brown’s accident happened, “I knew people trusted the system to do things it wasn’t designed for or capable of,” says JT Stukes, a senior project engineer at Tesla from 2014 to 2018. “The fact that this kind of accident happened is obviously tragic. But it was going to happen.”

Raven Jiang, an engineer who worked on Autopilot at Tesla from 2015 to 2016, notes that around the same time, Elizabeth Holmes’s transgressions at Theranos were being exposed to the public. “Some of those stories were in the back of my mind,” Jiang says. “It certainly made me wonder a lot more about what’s behind some of that public optimism.”

3. Tesla enjoys substantial public support anyway.

The most recent footage in “Crash Course” is from last month: Musk, wearing a black cowboy hat and black aviators, smiles onstage before an enthusiastic crowd at the launch party for Tesla’s new Gigafactory in Austin. Partygoers hold up their phones to film him as he talks – a stark reminder that Musk is a mega-celebrity and a hero to many.


Alex Poulos, a Tesla owner, points out that Musk superfans sometimes call themselves “Musketeers.” Kim Paquette, another owner who is part of an elite group testing new versions of the self-driving software, shows off her collection of Hot Wheels-size Teslas and says she is “honored” to be part of the testing process. “People who buy a Tesla understand that it’s not self-driving yet,” she says. Even Brown’s family, in a statement read on their behalf at a building dedication in his honor, says that “part of Joshua’s legacy is that the accident [that caused his death] leads to further improvements, making the new technology even safer. … Our family is heartened and proud that our son is having such a positive impact on future road safety.”

And yet, says Poulos, “fully autonomous driving is what I paid for and I don’t have it. It’s just there in the name, right? And I don’t think that’s fair.”

“Musk, I think he has a huge responsibility,” he adds. “I think he needs to be a bit more careful about what he says to his followers.”
