
Is Tesla Autopilot safe at all?

Author | Bangning Studio

"The focus has always been on innovation, not safety"


Compiled by | Yang Yuke

Edited by | Jane

Produced by | Bangning Studio (gbngzs)

When Robin Geoulla bought his Tesla Model S in 2017, he had reservations about the car's automated driving technology.

"It's kind of scary to rely on it completely and let it drive on its own, you know." Robin described to a U.S. investigator his initial feelings about Tesla's Autopilot system. In January 2018, a few days before the above sentiment was posted, Robin's Model S hit the rear of a fire truck parked on an interstate in California while autopilot was activated.

Over time, Robin's skepticism about Autopilot softened. He told National Transportation Safety Board (NTSB) investigators that the system was generally reliable at tracking the vehicle ahead, but seemed to become confused when facing direct sunlight or when the vehicle in front suddenly changed lanes.

He told investigators he had been driving toward the sun before rear-ending the fire truck, which seemed to confirm that view. The NTSB found that Autopilot's design allowed Robin to take his hands off the steering wheel while the car was moving: during the period Autopilot was engaged, his hands were off the wheel for almost 30 minutes.

Following a series of Autopilot-related accidents, some of them fatal, the NTSB has previously urged the National Highway Traffic Safety Administration (NHTSA) to investigate Autopilot's limitations, the potential for driver misuse, and a range of related safety risks.

Jennifer Homendy, the NTSB's new chair, told Reuters: "The facts show that the focus has always been on innovation, not safety. I hope this moment is the beginning of a change." She said Tesla's Autopilot cannot be compared with the far more rigorous autopilot systems used in aviation, which involve trained pilots and rules covering fatigue as well as drug and alcohol testing.

According to Tesla's website, Autopilot is an advanced driver assistance feature whose current version does not make the car autonomous. Before activating the system, drivers must keep their hands on the steering wheel and maintain control of the vehicle.


Robin's crash is one of 12 Autopilot-related cases NHTSA is investigating (including the September 13 accident in Florida), part of the agency's most in-depth probe since Tesla introduced Autopilot in 2015.

A Reuters review of NHTSA statements, NTSB documents and police reports shows that most of the accidents under investigation occurred after dark or in conditions of limited visibility such as glaring sunlight. Autonomous driving experts say this raises questions about Autopilot's ability to cope with unusual driving conditions.

An NHTSA spokesperson said in a statement to Reuters: "We will take action when we find an unreasonable risk to public safety."

▍ Investigate Autopilot

The investigation has already begun.

Since 2016, the U.S. auto safety regulator has dispatched 33 special crash investigation teams to examine Tesla accidents in which advanced driver assistance systems were suspected of being in use. NHTSA has ruled out Autopilot use in three of those non-fatal crashes.

NHTSA's current investigation has revived the question of whether Autopilot is actually safe. It also poses a major challenge for Tesla CEO Elon Musk.

Tesla has charged customers $10,000 for advanced driver assistance features such as automatic lane changes, promising that the cars will eventually drive themselves using only cameras and advanced software. Other automakers and self-driving companies rely not only on cameras but also on more expensive hardware, including radar and lidar.

Musk has said that Teslas equipped with eight cameras will be safer than human drivers. But experts and industry executives say camera technology can be compromised by darkness, direct sunlight, and harsh weather such as heavy rain, snow and fog.

Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University, believes that computer vision is far from perfect today and will remain so for the foreseeable future.


In 2016, the first fatal U.S. accident involving Autopilot occurred in Williston, Florida. Against a bright sky, neither the driver nor Autopilot recognized the white side of a tractor-trailer, and instead of braking, the Tesla drove straight into it.

Documents reviewed by Reuters show that NHTSA closed its investigation into that fatal crash in January 2017 and, after some contentious exchanges with Tesla officials, found no defect in Autopilot's performance.

In December 2016, as part of that investigation, the regulator issued a special order requiring Tesla to provide details of any internal safety concerns raised about Autopilot, including the possibility of driver misuse or abuse.

An NHTSA lawyer found Tesla's initial response inadequate. Todd Maron, Tesla's general counsel at the time, tried again, telling regulators that the request was "too broad" and that it was impossible to catalog every issue raised during Autopilot's development.

Still, Maron maintained that Tesla had been cooperative throughout. During Autopilot's development, he said, Tesla employees had raised concerns that the cars occasionally braked or accelerated unexpectedly, or suffered steering failures, along with certain instances of driver misuse and abuse, but he provided no further details.

NHTSA documents show regulators want to know how Teslas identify the flashing lights on emergency vehicles and detect fire trucks, ambulances and police cars on the road. The agency has requested similar information from 12 competitors. "Tesla is required to produce and validate the data, along with its interpretation of that data, and NHTSA will independently verify and analyze all of the information."


As an electric vehicle pioneer, Musk has repeatedly defended Autopilot against critics and regulators. Tesla also updates Autopilot software wirelessly over the air, sidestepping the traditional vehicle recall process.

Musk has repeatedly promoted Autopilot. Some critics argue that this publicity misleads consumers into believing Teslas can drive themselves, even though the owner's manual tells drivers to stay attentive and spells out the technology's limitations.

"In order to sell cars, some manufacturers will do what they want to do according to their position, which requires government regulatory control," Homandi said. ”

▍ The 2-second dividing line

Just as federal officials are reviewing Autopilot's safety, a new study shows how the system affects driver behavior.

According to MIT researchers, when Autopilot is switched on, drivers take their eyes off the road more often and for longer periods than when driving manually.

The study is believed to be the first to use real-world driving data to measure Autopilot drivers' attention and exactly where they look relative to the road ahead.

Pnina Gershon, a research scientist at the Massachusetts Institute of Technology and one of the study's authors, said: "This is the first time we have quantified Autopilot's impact on driver attention. Essentially, the data show that when Autopilot is engaged, drivers' eyes stay off the road for longer."


NHTSA is investigating 11 accidents, several of which involved a Tesla on Autopilot colliding with a stopped emergency vehicle. Tesla must submit the written responses and data federal regulators have requested by October 22, 2021.

That investigation is only one part of the broader safety concerns around Autopilot. In its investigations of multiple fatal crashes involving Autopilot, the NTSB said driver monitoring was inadequate and Autopilot lacked sufficient safeguards. The NTSB warns that such Level 2 systems can lead to human over-reliance, known as "automation complacency."

Misuse of Teslas is well documented on YouTube. Videos show drivers reading newspapers or sitting in the back seat, in blatant disregard of the driving task.

The MIT researchers tried to quantify attention by analyzing where Tesla drivers looked during the period in which they manually disengaged Autopilot and took back control of the vehicle.

Using in-car camera footage, the researchers examined 290 such handover episodes, distinguishing driving-related glances, toward the rearview mirror and dashboard, from non-driving glances, such as looking down or toward the center console. The study found that in Autopilot mode drivers made fewer driving-related glances and more non-driving glances, and 22% of those non-driving glances lasted longer than 2 seconds, compared with 4% during manual driving.

That 2-second dividing line matters. To curb distracted driving, NHTSA recommends that engineers design systems so that drivers' eyes are never off the road for more than 2 seconds. Gershon said the study also found several non-driving glances lasting more than 5 seconds.
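To make the 2-second criterion concrete, here is a minimal sketch in Python of how glance records might be checked against NHTSA's guideline. The data, glance categories, and function name are invented for illustration; this is not code from the MIT study or from NHTSA.

from dataclasses import dataclass

@dataclass
class Glance:
    target: str        # e.g. "road", "mirror", "dashboard", "center_console", "down"
    duration_s: float  # how long the glance lasted, in seconds

# Glance targets treated as driving-related in this illustration.
DRIVING_RELATED = {"road", "mirror", "dashboard"}
NHTSA_LIMIT_S = 2.0  # recommended maximum time with eyes off the road

def share_of_long_off_road_glances(glances):
    """Fraction of non-driving-related glances longer than the 2-second guideline."""
    off_road = [g for g in glances if g.target not in DRIVING_RELATED]
    if not off_road:
        return 0.0
    over_limit = sum(1 for g in off_road if g.duration_s > NHTSA_LIMIT_S)
    return over_limit / len(off_road)

# Fabricated sample: glances recorded around one hypothetical Autopilot disengagement.
sample = [
    Glance("road", 1.4),
    Glance("center_console", 2.6),
    Glance("mirror", 0.8),
    Glance("down", 3.1),
    Glance("road", 2.0),
    Glance("center_console", 1.2),
]
print(f"{share_of_long_off_road_glances(sample):.0%} of off-road glances exceed 2 s")

The same kind of tally, applied to real gaze data from in-car cameras, is what lets researchers compare how often drivers exceed the 2-second guideline with and without Autopilot engaged.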

Whether drivers can safely take their eyes off the road while driver assistance systems handle driving tasks is an important question for Tesla and other automakers. The findings suggest it is time to confront it.


In 2015, Tesla launched the first generation of Autopilot. In 2017, GM launched the Super Cruise Driver Assistance System. Since then, many automakers have followed suit with more advanced features.

The latest research shows that although these Level 2 systems have been on the road for years, there is still little comparative understanding of how well drivers actually use them.

Gershon worries about how isolated each manufacturer's development is, and about how these systems affect the humans who operate the vehicles. "If manufacturers continue to go it alone, we will end up with vehicles driving side by side on the road that were built on different design concepts and policy approaches, and that cannot work together effectively in one ecosystem," Gershon said. "So we need policies on what information the system designer should convey to the consumer, so the consumer can make driving decisions based on that information."

▍ Solve the safety problem first

The top U.S. accident investigator has urged Tesla to address safety concerns before expanding its self-driving capabilities.

In an interview with the Wall Street Journal, Homendy said the term "Full Self-Driving" was "misleading and irresponsible," and that the marketing may attract more attention than the warnings in owner's manuals.

On August 13, after being sworn in, Homendy said in her first interview with Bloomberg: "Whether it is Tesla or any other company, manufacturers have a responsibility to be honest about what their technology can and cannot do." She then praised Tesla's cooperation in many previous NTSB investigations, saying she did not want to single Tesla out.

For example, she said, TV commercials for various cars give the false impression that the vehicles can steer and brake themselves. "I was stunned," Homendy said.


Addressing the safety of these driver assistance systems is one item on the "long list" of actions Homendy has planned. "We have a great past, but we have to look to the future," she said. "We're in an era of transformational change. I've heard a lot about innovation and investment, but I haven't heard much about safety. That's the entry point, and safety has to be the focus."

Despite the Autopilot name, which suggests autonomous driving, and Musk's elaborate promises, the truth is that Autopilot is not technically capable of "fully autonomous driving." The steady stream of accidents and lost lives continues to attest to that.

"NHTSA reminds the public that there is not a single motor vehicle on the market that can drive itself." A spokesman for the agency said in an investigative statement.

It is unclear whether NHTSA can compel Tesla to change its behavior. The spokesperson added that every state's laws hold the human driver responsible for the operation of the vehicle.

Because much of the danger stems from Tesla's overhyping of its driver assistance systems' capabilities, more effective constraints could come from bodies such as the Federal Trade Commission. Last week, two senators called on the FTC to investigate whether Tesla's deceptive marketing is putting the public at risk.

On August 13, the U.S. auto safety regulator opened a preliminary investigation into Tesla Autopilot. According to NHTSA filings, the investigation covers Model Y, Model S, Model X and Model 3 vehicles from model years 2014 to 2021, an estimated 765,000 vehicles.

The agency's Office of Defects Investigation (ODI) said it has confirmed 11 such accidents since January 2018. In each, a Tesla approached a first-responder scene and then crashed into one or more vehicles at the scene. Across the 11 accidents, the agency's report counts 17 injuries and 1 death.

(Parts of this article draw on reporting by Automotive News, Reuters and Bloomberg; some images are from the Internet)