
An Apple Chinese engineer who died in a Tesla car accident

Author: World Advanced Manufacturing Technology Forum

According to media reports on the 9th, Tesla was scheduled to appear in court this week in a high-profile case over the safety of its Autopilot driver-assistance system, which was accused of causing a driver's death. Before the trial began, however, Tesla chose to reach a settlement with the family of the victim, Walter Huang, a Chinese-American engineer at Apple.

The terms of the settlement have not been made public, and Tesla has asked that the amount be kept confidential to prevent other potential claimants from citing it as evidence of Tesla's possible liability in similar cases.


▲The scene of the car accident (Image source: Red Star News)


An Apple engineer was killed in a Model X crash

Autopilot had been engaged for nearly 19 minutes

On March 23, 2018, Walter Huang was driving a Tesla Model X on the highway near Mountain View, California, when the car struck a highway barrier, killing him. A National Transportation Safety Board (NTSB) investigation found that Autopilot had been engaged for nearly 19 minutes before the crash, when the car veered off the highway at 71 miles per hour.


▲ The scene of the car accident

The NTSB also concluded in its 2020 report that Tesla's driver-assistance system, along with road-maintenance issues such as faded lane markings and the positioning of the barrier (a crash attenuator), were contributing factors in the accident.

But Huang had been using his phone before the crash, so the driver himself also bore responsibility. Tesla's lawyers argued that Huang was a negligent driver who was apparently playing a mobile game at the time of the crash. "It is indisputable that if he had been keeping an eye on the road, he would have had a chance to avoid this crash," one court filing stated.

Lawyers for Huang's family countered that Tesla has overstated the capabilities of its driver-assistance technology, which is not as safe as the company advertises. They noted in court filings that Tesla CEO Elon Musk had posted on social media suggesting that the system allows Tesla vehicles to be driven safely without watching the road or keeping one's hands on the steering wheel. They also questioned whether Tesla understood that drivers might not, or could not, use the system as directed, and what measures the automaker had taken to protect them.

Although Huang's family acknowledged that he was distracted while the car was driving, they believe Tesla nonetheless marketed Autopilot as self-driving software despite knowing it had safety flaws. The family filed a wrongful death lawsuit against Tesla, with claims focused in part on safety and design defects in Tesla's driver-assistance system.


Tesla:

Driving requires a fully attentive driver

Tesla's driver-assistance technology has been under intense scrutiny since Huang's crash. The National Highway Traffic Safety Administration (NHTSA) and the NTSB have been investigating collisions involving Tesla vehicles using various driver-assistance features, including a series of crashes into emergency vehicles at other accident scenes.


▲ Image source: Visual China (file photo)

After two years of investigation covering at least 956 accidents, the agency has opened more than 40 separate investigations into crashes involving Tesla's Autopilot system, which together resulted in 23 deaths.

The National Highway Traffic Safety Administration says driver-assistance systems can give drivers a false sense of security. In its December 2023 investigation report, the agency said the system can easily be misused in certain dangerous situations where it may not be able to navigate safely. Shortly after the report's release, Tesla recalled about 2 million vehicles in the U.S., adding more prominent warnings to drivers when Autopilot is engaged, such as reminders to watch the road and keep their hands on the steering wheel.

The accident has sparked widespread discussion about the safety and reliability of automated driving systems. Tesla faces lawsuits and investigations over collisions involving its Autopilot and Full Self-Driving assistance systems, and the company has blamed inattentive drivers. These systems can steer, accelerate, and brake on their own on open roads, but they cannot fully replace a human, especially in urban driving. Tesla has explained that they do not make the car autonomous, and that driving requires a "fully attentive driver" who can "take over at any time."
