
Musk laments that Tesla Autopilot is blamed for the lives it fails to save, not praised for the lives it saves

Tesla CEO Elon Musk said: "When we started developing self-driving technology, someone said to me: Even if you save 90% of lives, the 10% of people you don't save will condemn you."

Musk went on to say: "I think it's the kind of thing where you don't necessarily get rewarded for saving lives, but you will definitely be blamed for the lives you don't save."

Musk's comments come as Tesla's self-driving technology faces legal challenges. The technology has been linked to 12 accidents since 2018. Some Tesla drivers are filing lawsuits over fatal crashes, while others are pursuing legal action against the company over alleged misrepresentation and deceptive marketing of its driver-assistance system, Autopilot, and its upgraded Full Self-Driving (FSD) service.

Not long ago, Musk maintained that he had neither misled Tesla owners about self-driving technology nor put their safety at risk. "I don't think there's any other CEO in the world who cares more about safety than I do," he said.

Musk explained that he pushed hard to develop Autopilot because the system could improve driving efficiency and save millions of lives each year. When activated, Tesla's self-driving technology is designed to monitor its surroundings, keeping the car in the center of the lane and at a safe distance from other cars.

However, Tesla acknowledges that Autopilot does not make cars fully self-driving and still requires the driver's full attention to avoid accidents.

Tesla said in its Vehicle Safety Report that in the second quarter of 2021, Tesla cars using Autopilot had one crash for every 7.09 million kilometers traveled, while Tesla cars not using Autopilot had one crash for every 1.93 million kilometers.
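As a quick sanity check on those figures, the gap between the two reported crash rates can be computed directly. The sketch below uses illustrative variable names; the numbers are the kilometer figures quoted above from Tesla's report:

```python
# Figures from Tesla's Q2 2021 Vehicle Safety Report, in millions of
# kilometers driven per reported crash.
km_per_crash_with_autopilot = 7.09
km_per_crash_without_autopilot = 1.93

# How many times farther a car travels per crash with Autopilot engaged.
ratio = km_per_crash_with_autopilot / km_per_crash_without_autopilot
print(f"With Autopilot, cars went about {ratio:.1f}x farther per crash")
```

By Tesla's own numbers that is roughly a 3.7x difference, though the figures are as reported by the company and not a controlled comparison.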

In August, the National Highway Traffic Safety Administration (NHTSA) began investigating 765,000 Tesla vehicles produced since 2014, after Autopilot was involved in 11 traffic accidents with emergency vehicles, killing 1 person and injuring 17 others.

In a letter sent to Tesla in September, NHTSA asked the company for more information about the confidentiality agreements it had with owners and asked Tesla to recall vehicles if a software update was needed to fix a safety defect. Tesla updated Autopilot shortly after the investigation began in an attempt to address these issues.
