
Another fatal Tesla crash: G7, the largest taxi company in Paris, pulls all its Model 3s from service

G7, the largest taxi company in Paris, announced on Tuesday (December 14) that it would withdraw all 37 Tesla Model 3s from its service fleet and temporarily stop using the model.

The announcement follows a fatal Tesla accident a few days earlier.

On the night of December 11, a G7 Tesla Model 3 taxi suddenly lost control while driving. According to French media reports, the accident occurred in the busy 13th arrondissement of Paris: the car first struck a cyclist and three pedestrians, then collided with a van. The accident killed one passerby and injured 20 others; the Tesla's driver and passengers were unhurt.

G7 deputy CEO Yann Ricordel explained that on the day of the incident, an off-duty company taxi driver was taking the Model 3 to a restaurant with his family. The driver tried to brake, but the car unexpectedly accelerated. While it is unclear whether Autopilot was engaged at the time, the company suspects the Tesla has a technical flaw.

In response, Tesla said it had remotely retrieved the sensor and camera data from the crashed vehicle and that its preliminary investigation had ruled out a technical fault in the car, a position similar to the company's statements after previous accidents. However, in response to a query from Agence France-Presse, Tesla said it would cooperate fully with the follow-up investigation by the Paris authorities.

"There is nothing to suggest that it is related to a technical issue," French Transport Minister Jean-Baptiste Djebbari said on December 15, adding that there was no indication at this stage that the fatal accident was caused by a technical failure of the Tesla Model 3. Djebbari told RMC Radio (Radio Monte Carlo) that he was not worried about the accident. G7, for its part, wants to suspend all Model 3 service until the police investigation concludes, while noting that its 50 Model S vehicles remain in service.

Meanwhile, since 2016, the US National Highway Traffic Safety Administration (NHTSA) has opened 33 investigations into Tesla accidents, 11 of which involve suspected use of the Autopilot system.

On July 26 this year, a Tesla Model Y in the United States, operating with Autopilot engaged, struck and killed Jean Louis, a 52-year-old man who was changing a tire on a disabled car at the side of the road. NHTSA has also been investigating 11 accidents between January 2018 and July 2021 in which Tesla electric cars crashed into parked emergency vehicles. Authorities said the accidents injured 17 people and killed one, and every Tesla involved, without exception, had Autopilot enabled.

Given these cases, Tesla's handling of the Autopilot system has increasingly drawn NHTSA's attention. In a letter to Tesla in October, NHTSA noted that automakers are obligated to initiate a recall within five days of identifying a safety-related defect, and asked Tesla to explain by November 1 why it had not done so.

Tesla's response is that Autopilot is merely a driver-assistance system that keeps the car in its lane and maintains distance from the vehicle ahead; the driver must keep both hands on the steering wheel and watch the traffic at all times.

Although Tesla has not explained why vehicles using Autopilot have failed, accidents involving the system are clearly mounting, and Tesla's earlier characterization of Autopilot told a different story. Tesla's original Chinese translation of Autopilot was not today's "automatic assisted driving" but "automatic driving", and the company promoted it as a standalone function. The name was changed only after the world's first fatal Tesla Autopilot crash, which occurred in China in 2016.

On January 20, 2016, a Tesla sedan on the Handan section of the Beijing-Hong Kong-Macao Expressway in Hebei Province plowed into a road sweeper that was at work. The Tesla was wrecked on the spot, and its driver, Gao Yaning, died.

Traffic police initially found Gao Yaning primarily responsible for the crash, while dashcam footage showed that the Autopilot system he was using at the time failed to recognize the road sweeper and therefore did not react.

After more than a year of litigation brought by Gao Jubin, Gao Yaning's father, Tesla, confronted with extensive evidence, admitted on the afternoon of February 27, 2018 that the vehicle had its then-imperfect "automatic driving" engaged at the time of the accident, and subsequently renamed the feature from "automatic driving" to "automatic assisted driving".
