
How did Musk screw up self-driving, step by step?

Focus:

  • Seven years ago, Tesla launched Autopilot, its high-profile self-driving solution. Two years ago, Tesla ran into an unprecedented supply-chain crisis, and Musk decided to cut costs at any price, including in the autonomous-driving business. But the character flaws of an assertive, stubborn, results-hungry, domineering boss nearly drove Tesla's self-driving program off the road;
  • In May 2021, Tesla announced it was removing all radar from its new cars. North American media spoke with more than a dozen former Tesla employees, test drivers, safety officials, and technical experts, who broadly agreed that abruptly cutting the vital radar set off a concentrated outbreak of problems, and Tesla collisions and safety incidents rose sharply;
  • Musk's capricious leadership style has also had side effects on FSD. At a company where the workaholic boss pushed people to develop technology at breakneck speed and bring it to market before it was ready, several engineers said they worry that FSD is unsafe on public roads even today;

Since the day Tesla was founded, Elon Musk has had a strong desire to make electric cars drive themselves on the road.

It is a hard problem, and a project that burns money. It wasn't until 2016 that Musk set the tone for autonomous driving in Master Plan Part Deux, the second chapter of his master plan, and Tesla went on to launch Autopilot, its high-profile self-driving solution.

Two years ago, Tesla ran into an unprecedented supply-chain crisis, and Musk decided to cut costs at any price. That included the autonomous-driving business.

To save money, Musk first took aim at the car's perception layer: he felt the eight cameras on a Tesla's body were enough, and that a vision-led solution could entirely replace expensive lidar.

Assertive, stubborn, hungry for quick results: the character flaws of a domineering boss nearly drove Tesla's self-driving program off the road. Over the years, Musk's self-driving business has not gone as smoothly as hoped. The Washington Post recently published an in-depth investigation of Tesla, and NE Times has distilled the relevant content for readers.

|With one word, the radar was cut|

"8 cameras, that's enough."

Musk's decision to go with a pure-vision solution left many executives slack-jawed. He repeatedly dismissed the objections of front-line engineers, convinced that the camera-led route was simpler, cheaper, and more intuitive; he even declared on Twitter that the road system is designed for cameras (eyes) and neural networks (brains).

The pure-vision route drew opposition from many executives: without radar or lidar as a backup, once a camera is blinded by raindrops or glare, the vehicle is prone to basic perception errors, even to crashing. How would Tesla respond?

In May 2021, Tesla announced it was removing all radar from its new vehicles. Speaking to North American media, more than a dozen former Tesla employees, test drivers, safety officials, and technical experts broadly agreed that once the vital radar was abruptly cut, problems began to erupt in clusters, and Tesla collisions and safety incidents rose sharply.

One former Tesla Autopilot engineer even bluntly said that using FSD on the street is unsafe because you can't predict what the car will do next.

These former employees said that after Tesla announced it would abandon radar, vehicle safety incidents became frequent. At the time, the Full Self-Driving test program was expanding from thousands of drivers to tens of thousands, and according to those owners' feedback, a series of problems broke out, such as "phantom braking," the system misreading road signs, and failures to detect obstacles such as emergency vehicles.

The surge in phantom-braking incidents stemmed from the lack of radar.
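Radar gave the perception stack a second, independent measurement to check camera detections against. As a rough illustration of that redundancy argument (a toy sketch with assumed names and thresholds, not Tesla's software), a fused braking rule can veto a confident camera false positive when the radar sees no nearby return:

```python
# Toy illustration of the redundancy argument, not Tesla's software.
# All names and thresholds here are assumptions for the sketch.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraDetection:
    confidence: float       # 0.0-1.0 score from the vision network
    est_distance_m: float   # distance estimated from pixels (noisy)

def should_emergency_brake(cam: Optional[CameraDetection],
                           radar_distance_m: Optional[float],
                           brake_threshold_m: float = 30.0) -> bool:
    """Brake only when the available sensors agree an obstacle is close."""
    if cam is None or cam.confidence < 0.5:
        return False
    if radar_distance_m is None:
        # Camera-only car: no second opinion, so a confident false
        # positive (a shadow, an overpass) goes straight to the brakes.
        return cam.est_distance_m < brake_threshold_m
    # Fused mode: the radar return must corroborate the camera.
    return radar_distance_m < brake_threshold_m

# A confident camera false positive, e.g. a shadow read as an obstacle:
ghost = CameraDetection(confidence=0.9, est_distance_m=20.0)
print(should_emergency_brake(ghost, radar_distance_m=None))    # True: phantom brake
print(should_emergency_brake(ghost, radar_distance_m=150.0))   # False: radar vetoes it
```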

Data from the National Highway Traffic Safety Administration (NHTSA) shows that such incidents surged at Tesla last year, prompting federal regulators to open an investigation into FSD. Complaints about phantom braking rose to 107 in a three-month span, compared with 34 over the previous 22 months.

NHTSA received about 250 more complaints in the two weeks after overseas media reported the issue, and the agency said it had logged 354 complaints over a nine-month period before opening its investigation.
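Normalizing those counts to a monthly rate shows how sharp the jump was; a back-of-the-envelope check using only the figures reported above:

```python
# Back-of-the-envelope check using only the complaint counts above.
before = 34 / 22    # complaints per month over the previous 22 months
after = 107 / 3     # complaints per month over the recent 3 months

print(f"before: {before:.1f} complaints/month")   # ~1.5
print(f"after:  {after:.1f} complaints/month")    # ~35.7
print(f"jump:   ~{after / before:.0f}x")          # ~23x
```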

"The purely visual route, not the only reason they are in trouble, but the key reason." Missy Cummings, a former NHTSA senior safety adviser, criticized Tesla's approach and chose to shy away from matters related to Tesla.

In the eyes of many departed employees, the troubles of Tesla's self-driving business stem from unrealistic cost-cutting and Musk's reckless decision to scrap the radar. Beyond that, many of the company's current difficulties can be traced to Musk's character flaws.

|Impetuous, short-sighted, eager to get things done|

At Tesla's Autonomy Day in April 2019, Musk made a bold prediction. Thumping his chest, he told a room full of investors and reporters that by the middle of the following year Tesla would have more than a million FSD-equipped cars on the road:

"The software can be updated through the online transmission system, the autonomous driving is very reliable, and the driver can sleep on the road."

The following year, Tesla's soaring stock made it the world's most valuable automaker, and Musk rode the wave to become the world's richest man. But the shine concealed an unsound development strategy: muddled thinking, incoherent decision-making, and unrealistic execution.

Tesla insiders say rivals such as Waymo and Apple take a different approach, setting strict testing rules, such as tightly limiting where self-driving software can operate. Tesla, by contrast, rushed FSD out to 360,000 owners, each of whom paid an extra $15,000 to unlock the features.

About two years ago, a video went viral on YouTube showing a Tesla using FSD to thread its way down Lombard Street in San Francisco. The street, famous for its eight hairpin turns and billed as the most crooked in the world, helped the clip draw tens of thousands of viewers.

But according to John Bernal, a former data-labeling specialist, and others, Tesla engineers had quietly coded invisible barriers into the software, like the bumpers in a bowling alley, to help the car stay on the road.

This is the "real secret" that the vehicle can run normally in the video. Tesla's self-driving test engineer bluntly said that he had driven Tesla vehicles to test the twists and turns of Lombard Street at work, and the user experience and smoothness were far from the effect shown in the video.

Meanwhile, Musk was deeply impatient with FSD's progress, demanding that the test team ship frequent bug fixes and that engineers step in to tweak the code on the fly. A former Tesla executive recalled an FSD engineer venting to him that "no one comes up with a good solution while being chased by a tiger."

Musk's capricious leadership style has had side effects on FSD as well. At a company where the workaholic boss pushed people to develop technology at breakneck speed and take it to market before it was ready, several engineers said they worry that FSD is unsafe on public roads even today.

John Bernal, the former data-annotation specialist from Tesla's Autopilot division, says internal progress on FSD was in fact slow: Musk kept tweeting about FSD's successes, but the engineers inside knew perfectly well how far the product lagged behind those expectations.

Bernal joined Tesla in 2020 as a fan of the company's technology and ran a YouTube channel showing off its latest features. But in February 2022 he was fired after posting a video in which the FSD Beta system, maneuvering the car through San Jose, drove it straight into a roadside bollard.

|Can't keep talent|

Disillusioned, more and more members of the autonomous-driving team are choosing to leave, including senior executives.

Several former engineers said that in late 2020, employees on the Autopilot team turned on their computers and found workplace-monitoring software installed. It logged keystrokes and mouse clicks and tracked their image-labeling work; if the mouse stopped moving for a stretch, a timer started, and the employee could be reprimanded or even fired by superiors for "idling too long."

Surveillance on office computers crossed a psychological line for employees, even though Tesla officially explained that timing the image-labeling work was meant to improve the efficiency of the labeling software and to estimate how long each image takes to annotate.
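Going only by the former engineers' description, the mechanism sounds like a plain idle timer: record the last input event and flag the worker once the gap exceeds some cutoff. A speculative sketch; the two-minute threshold and the flagging behavior are assumptions, not reported details:

```python
# Speculative sketch of the idle timer described above; the cutoff
# value and what happens on a flag are assumptions, not reported.
import time

IDLE_LIMIT_S = 120.0  # assumed cutoff before a worker is flagged

class IdleMonitor:
    def __init__(self) -> None:
        self.last_input = time.monotonic()

    def on_input_event(self) -> None:
        """Called on every keystroke or mouse click."""
        self.last_input = time.monotonic()

    def is_idle(self) -> bool:
        """True once no input has arrived for IDLE_LIMIT_S seconds."""
        return time.monotonic() - self.last_input > IDLE_LIMIT_S

monitor = IdleMonitor()
monitor.on_input_event()   # a keystroke resets the clock
print(monitor.is_idle())   # False immediately after input
```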

The engineers were exhausted.

More and more talent is choosing to leave Tesla and look for opportunities elsewhere. Andrej Karpathy, Tesla's director of artificial intelligence, took a months-long sabbatical last year and this year joined OpenAI, whose newly launched language model ChatGPT has shocked the industry.

There are reports that Tesla's Autopilot director Ashok Elluswamy has been working at Twitter, though a former employee with knowledge of the matter said the engineering lead has decided to go to Waymo.

In the self-driving business, Waymo and Tesla have taken two completely different routes. The former has never had to seriously worry that its cars will one day cause large-scale accidents, such as blowing through a stop sign; it is committed to long-termism rather than rushing for results.

On top of that, Musk is often highly resistant to advice from subordinates, and over the past few years he has fired several Tesla employees who opposed his plans.

Last year, Musk acquired Twitter, leaving Tesla with a CEO who could not focus. Several employees familiar with the matter revealed that after the acquisition closed in October, Musk pulled dozens of engineers, including some from the driver-assistance and full self-driving teams, over to work with him at Twitter.

This further hampered Tesla's development. The immediate effect: software updates that used to ship every two weeks suddenly arrived months apart.

|US government: time to act|

Bloomberg reported that the U.S. Securities and Exchange Commission is investigating Musk, chiefly over his role in promoting Tesla's self-driving claims and a series of related statements.

A related lawsuit alleges that Tesla has repeatedly made "false and misleading" statements, "grossly exaggerating" the safety of its Autopilot and Full Self-Driving systems.

Tesla's record in self-driving over the past few years has not impressed even the head of the U.S. Department of Transportation. In a recent Bloomberg interview, Transportation Secretary Pete Buttigieg blasted Tesla, saying the company's name for its driver-assistance system defies common sense.

In his opinion, using the term Autopilot is inconsistent with Tesla's current requirement that drivers keep their hands on the wheel.

Notably, NHTSA, which has been running safety investigations into Tesla for years, sits within the Department of Transportation that Buttigieg heads, so in the face of public criticism from this heavyweight, Musk dares not snap back lightly.

In August 2021, NHTSA opened an investigation into potential safety defects in Autopilot, drawing wide public attention; then, in early 2022, the agency began investigating Tesla's sudden braking as well.

But for Musk, who has always relished a fight, the attitude is to take each challenge as it comes; none of it stops him from standing center stage and continuing to showcase his own products.

At Tesla's investor event last month, Musk made a point of releasing FSD collision data to argue that his product is safer than human drivers, and, naturally, to claim that its automated-driving safety is five to six times better than average driving in the United States.

He even offered figures: with FSD Beta, 3 collisions per 10,000 miles traveled, versus 1 collision per 2,000 miles for American drivers.
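Putting both reported rates on the same denominator makes the comparison easier to read (taking the figures above at face value):

```python
# Normalizing the two reported collision rates to per-10,000-mile terms
# (taking the article's figures at face value).
fsd_per_10k = 3                  # reported: 3 collisions per 10,000 miles on FSD Beta
human_per_10k = 10_000 / 2_000   # reported: 1 per 2,000 miles => 5 per 10,000

print(f"FSD Beta:       {fsd_per_10k} collisions per 10,000 miles")
print(f"Average driver: {human_per_10k:.0f} collisions per 10,000 miles")
```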
