
736 crashes and at least 17 deaths in four years: Tesla Autopilot's real accident data revealed


Focus

  • 1 Analysis of National Highway Traffic Safety Administration data shows 736 crashes involving Tesla's driver-assistance technology in the United States over the past four years, killing at least 17 people.
  • 2 The surge in such crashes over the past four years shows that Tesla's driver-assistance technology is being used more widely, appearing on more and more U.S. roads, and carrying serious risks.
  • 3 Many experts say some of Musk's decisions appear to be behind the rise in reported crashes, such as broadening the availability of these features and removing radar sensors from vehicles.
  • 4 Musk has said Tesla cars with Autopilot engaged are safer than cars driven entirely by humans, and believes the benefits ultimately outweigh the harms.

Tencent Auto News, June 11: U.S. media analysis of National Highway Traffic Safety Administration data found that there have been 736 crashes in the United States involving Tesla's driver-assistance technology since 2019, far more than previously reported, and that at least 17 people were killed in those crashes. Experts say the $15,000 assisted-driving system falls far short of Tesla's promise of self-driving cars, and the technology is error-prone.


The number of crashes involving Tesla's Autopilot has surged

One afternoon in March, 17-year-old Tillman Mitchell stepped off a school bus whose red warning lights were flashing, according to a police report. At that moment, a Tesla Model Y came down North Carolina Highway 561.

The electric car, which allegedly had the driver-assistance feature Autopilot activated at the time, showed no sign of slowing down. It hit Mitchell at more than 70 kilometers per hour. According to his aunt, Dorothy Lynch, Mitchell was thrown into the Model Y's windshield, flew into the air and landed face down on the road. Mitchell's father heard the crash and rushed out from the porch to find his son lying in the middle of the road. "If it had been a smaller child," Lynch said afterwards, "the child would have been killed on the spot."

Media analysis of National Highway Traffic Safety Administration (NHTSA) data found that the crash in Halifax County, North Carolina, was one of 736 U.S. crashes since 2019 involving Tesla's Autopilot feature, far more than previously reported. The data shows such crashes have spiked over the past four years, indicating that Tesla's assisted-driving technology is being used more widely, appearing on more and more American roads, and carrying serious risks.

The data also shows a significant rise in deaths and serious injuries in Autopilot-related crashes. When the National Highway Traffic Safety Administration first released a partial accounting of crashes involving the driver-assistance system in June 2022, it counted only three fatalities definitively linked to the technology. The latest figures include at least 17 fatal crashes, 11 of them since May last year, along with five crashes causing serious injuries.

Mitchell survived the March crash but suffered a broken neck and a broken leg and had to be placed on a ventilator. He still has memory problems and difficulty walking. His aunt, Dorothy Lynch, said the crash should serve as a warning about the dangers of the technology. "I pray that this is a learning process," Lynch said. "People shouldn't be so trusting when it comes to a machine."

Tesla CEO Elon Musk has said that Tesla cars with Autopilot activated are safer than cars driven entirely by humans, citing crash rates for the two driving modes. He has pushed the automaker to develop and deploy assisted-driving features that handle everything from school buses and fire trucks to stop signs and pedestrians, and he believes the technology will help create a safer, accident-free future. While it is impossible to say how many crashes the technology may have prevented, the data shows it has obvious flaws in real-world testing on U.S. highways.

The media found that the 17 fatal crashes involving Tesla vehicles reveal distinct patterns: four involved a motorcycle and one involved an ambulance. Meanwhile, many experts say some of Musk's decisions appear to be behind the rise in reported crashes, such as broadening the availability of these features and removing radar sensors from vehicles.

Neither Tesla nor Musk responded to requests for comment.

The National Highway Traffic Safety Administration said a crash report involving a driver-assistance system does not by itself prove that the technology was the cause of the accident. Veronica Morales, a spokeswoman for the agency, said: "NHTSA is actively investigating Tesla Autopilot, including its upgraded Full Self-Driving (FSD) system. NHTSA reminds the public that all advanced driver-assistance systems require the human driver to remain in control and fully engaged in the driving task at all times. Accordingly, all state laws hold the human driver responsible for the operation of their vehicle."

Musk has repeatedly defended his decision to introduce driver-assisted driving technology to Tesla owners, arguing that the benefits outweigh the disadvantages.


He said last year: "At the point where you believe that adding assisted-driving features reduces injuries and deaths, I think you have a moral obligation to deploy them, even though you're going to get sued and blamed by a lot of people. Because the people whose lives you saved don't know their lives were saved, while the people who do die or get injured in a handful of accidents certainly know, or their estate does."

Missy Cummings, a former senior safety adviser at the National Highway Traffic Safety Administration and a professor at George Mason University's College of Engineering and Computing, said the surge in Tesla crashes is troubling.

Responding to the data analyzed by the media, she said: "Tesla's crashes are more severe, and more often fatal, than those in a normal data set. One likely reason is that over the past year and a half, the company's assisted-driving technology has been heavily promoted and can now be used on city streets and in residential areas; practically anyone can have it. Is it reasonable to expect that to push up the accident rate? Absolutely."

Cummings said the rise in fatalities relative to overall crashes is also a concern. It is unclear whether the National Highway Traffic Safety Administration data covers every crash involving Tesla's driver-assistance systems, and it includes multiple crashes in which it is "unknown" whether Autopilot or FSD was in use.

The National Highway Traffic Safety Administration began collecting the data after a 2021 federal order required automakers to disclose crashes involving driver-assistance technology. The total number of crashes involving the technology is small compared with all road accidents; the agency estimates that more than 40,000 people died in car crashes of all kinds last year.

The data shows that since the reporting requirement took effect, the vast majority of the 807 crashes linked to assisted-driving technology have involved Tesla. Tesla, which has experimented more aggressively with the technology than other automakers, is also linked to almost all of the fatalities.

Subaru ranked second, reporting 23 crashes since 2019. The huge gap likely reflects the broader deployment and use of driver-assistance technology across Tesla's fleet, as well as Tesla's strong encouragement of drivers to use Autopilot.

Tesla FSD users grew more than 30-fold in about a year

Tesla's Autopilot, introduced in 2014, includes features that let cars hold a set speed, keep their distance from other vehicles and follow lane lines from on-ramp to off-ramp on the highway. Tesla has made Autopilot standard on its cars, meaning more than 800,000 vehicles on U.S. roads are equipped with it, though the more capable FSD package costs extra.

FSD, an experimental package that customers must pay extra to use, lets Tesla vehicles follow route directions, stop for stop signs and traffic lights, make turns and lane changes automatically, and respond to hazards along the way. Tesla says that whichever system is in use, drivers must monitor the road at all times and intervene when necessary.

The rise in crashes coincides with Tesla's aggressive rollout of FSD, which has expanded from about 12,000 users to nearly 400,000 in a little more than a year. Nearly two-thirds of all driver-assistance crashes that Tesla has reported to the National Highway Traffic Safety Administration occurred in the past year.

Philip Koopman, a Carnegie Mellon University professor who has studied the safety of self-driving cars for 25 years, said: "Tesla's wide rollout of the technology raises key questions, and a clear rise in the number of crashes is certainly cause for concern. We need to figure out whether it is because the crashes are more severe, or because of some other factor, such as more miles being driven with Autopilot on."

In February, Tesla recalled more than 360,000 cars equipped with the FSD system over concerns that the software could cause cars to disobey traffic lights, stop signs and speed limits.

Documents released by the safety agency said that such traffic-law violations "may increase the risk of a collision" if the driver does not intervene. Tesla said it had fixed the issues with over-the-air software updates, resolving the risk remotely.

While continuing to refine its assisted-driving software, Tesla has also taken unprecedented steps, removing radar sensors from new cars and disabling the radar on vehicles already on the road. Amid a global shortage of computer chips, Musk pushed out a simpler hardware suite, depriving the cars of a key sensor. Musk said last year that "only very high-resolution radar makes sense."

According to government documents seen by the media, Tesla has recently taken steps to reintroduce radar sensors.

In a presentation in March, Tesla claimed that cars with FSD activated crash at least five times less often than vehicles in normal driving, measured in miles driven per collision. Without access to the detailed data Tesla holds, that claim, like Musk's description of Autopilot as "absolutely safer," cannot be verified.
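For context, the metric behind that claim is straightforward: miles driven divided by the number of collisions. The sketch below is a minimal illustration, using purely hypothetical figures (not Tesla's numbers), of how a "five times lower" rate would arise:

  miles per collision = total miles driven ÷ number of collisions
  FSD engaged (hypothetical): 150,000,000 miles ÷ 50 collisions = 3,000,000 miles per collision
  Normal driving (hypothetical): 60,000,000 miles ÷ 100 collisions = 600,000 miles per collision
  Ratio: 3,000,000 ÷ 600,000 = 5, a fivefold difference

Even a genuine fivefold gap can mislead if the two fleets drive in different conditions: Autopilot, for example, operates mainly on highways, where crashes per mile tend to be rarer than on city streets, so the comparison is only meaningful if the mileage is matched by road type.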

Should self-driving technology be banned?

Autopilot, primarily a driver-assistance system for highways, operates in a less complex environment than the full range of situations an ordinary road user encounters.

It is not clear which system was in use in the fatal crashes: Tesla has asked the National Highway Traffic Safety Administration not to disclose that information. In the section of the agency's data that specifies the software version, Tesla's crashes are marked, in all capital letters: redacted, may contain confidential business information.

In recent years, both Autopilot and FSD have come under close scrutiny. U.S. Transportation Secretary Pete Buttigieg said last month that Autopilot is not an appropriate name for a system that requires drivers to keep their hands on the wheel and their eyes on the road at all times.

The National Highway Traffic Safety Administration has opened multiple investigations into Tesla crashes and other problems with its driver-assistance software. One focuses on so-called "phantom braking," in which a vehicle suddenly slows sharply for a hazard that is not there.

Last year, a Tesla Model S allegedly braked abruptly on the San Francisco Bay Bridge while its driver-assistance system was in use, triggering a pileup involving eight vehicles and injuring nine people, including a two-year-old child. In other complaints filed with the National Highway Traffic Safety Administration, owners said their cars also braked hard when encountering oncoming trucks.

Many of the crashes share similar settings and conditions. The National Highway Traffic Safety Administration has received reports of more than a dozen crashes in which Teslas with Autopilot activated drove into parked emergency vehicles. Last year, the agency upgraded its investigation into those crashes to an "engineering analysis," a step that could allow it to force Tesla to conduct a recall.

Also last year, the National Highway Traffic Safety Administration opened two back-to-back special investigations into fatal crashes involving Tesla cars and motorcyclists. One occurred in Utah, where a rider on a Harley-Davidson motorcycle was traveling on Interstate 15 outside Salt Lake City when a Tesla with Autopilot activated struck him from behind.

The Utah Department of Public Safety said: "The driver of the Tesla did not see the motorcyclist and collided with the back of the motorcycle, throwing the rider, who died instantly. It is very dangerous for motorcycles to ride around Tesla cars."


Of the hundreds of crashes involving Tesla's driver-assistance systems, the National Highway Traffic Safety Administration has singled out about 40 for further analysis, hoping to gain a deeper understanding of how the technology operates. Among them was the North Carolina crash in which Mitchell was struck after stepping off the school bus.

According to Marcus Bethea of the North Carolina Highway Patrol, the Tesla driver, Howard G. Yee, was charged with multiple counts in the crash, including reckless driving and passing a stopped school bus and striking a person, a Class I felony. Investigators found that Yee had fixed weights to the steering wheel to trick Autopilot into registering that a driver's hands were on the wheel: Autopilot is designed to disable its functions if it does not detect steering pressure from the driver for an extended period.

The National Highway Traffic Safety Administration is still investigating the crash, and an agency spokesperson declined to provide further details, citing the ongoing investigation. Tesla asked the agency to withhold the company's summary of the accident from public view, saying it could contain confidential business information.

Lynch said her family has been thinking of Yee and believes his actions were a mistake caused by excessive trust in the technology, which experts call "automation complacency." "We don't want his life to be ruined because of this stupid accident," she said.

But when asked what she thought of Musk, Lynch was more pointed. "I think he needs to get rid of this so-called self-driving," she said. "This technology should be banned!" (Golden Deer)
