
Two More Defect Investigations: NHTSA's Conflict with Tesla Continues to Escalate



Over the past four years, Tesla vehicles have been involved in more than a dozen collisions while using the company's driver-assistance system, Autopilot, raising questions about the technology's safety. Although Tesla regards the technology as its "crown jewel," the scrutiny Autopilot now faces from U.S. regulators is unprecedented.


Recently, the National Highway Traffic Safety Administration (NHTSA) launched two formal defect investigations that could ultimately force Tesla to update its vehicles and restrict Autopilot in conditions where it is not yet safe to use. NHTSA also has the authority to order a mandatory recall.

The National Transportation Safety Board (NTSB) has repeatedly called for stronger regulation of autonomous vehicles and has recommended that Tesla adopt safeguards like those in the driver-assistance systems used by GM and Ford. Tesla has not responded to the NTSB's recommendations and has instead continued its high-risk approach.

Tighter regulation of Autopilot could damage Tesla's reputation among consumers and rattle investors, whose faith in the company's self-driving prospects has helped make CEO Elon Musk the world's richest man.

Testing the limits of policy

Tesla has long taken a "light-touch" approach to rolling out its self-driving technology, testing the limits of policy. In October 2015, Tesla released a software update that enabled Autopilot. Within days, users posted videos of themselves ignoring the company's warnings to keep their hands on the steering wheel: in one, a car nearly drove itself off the road; in another, a car nearly collided with an oncoming vehicle.

In May 2016, a Tesla owner in Florida crashed into a trailer while driving a Model S on Autopilot. Weeks after the accident, then-Transportation Secretary Anthony Foxx said NHTSA would issue guidelines, not regulations, for self-driving technology.

In March, when asked when European owners would be able to test Full Self-Driving (FSD), Musk pointed to regulation. "In the U.S., everything is legal by default," Musk said. "In Europe, [full self-driving testing] is illegal by default, so we have to get approval beforehand; in the U.S., you can more or less do it on your own initiative."

Tesla's approach to self-driving stands in stark contrast to that of traditional automakers General Motors and Ford, both of which install cameras behind the steering wheel to monitor whether drivers are paying attention and limit use of their systems to highways their engineers have mapped and tested.

David Friedman, who served as NHTSA's deputy administrator and acting administrator from 2013 to 2015, said: "Tesla has been like a thorn in our side for many years."

NHTSA has repeatedly reminded the public that no commercially available vehicle can drive itself. The agency has opened 31 special investigations into crashes involving driver-assistance systems, 24 of them involving Tesla. Yet Tesla continues to sell FSD, charging $12,000 for it.

"We all admire his foresight," Heidi King, a former acting administrator of NHTSA, said of Musk, "but his exaggeration of consumer products can be very dangerous."

Increasing scrutiny

Heidi King was one of several acting administrators during a five-year leadership vacuum at NHTSA; the agency's last confirmed administrator left in January 2017. The lack of a permanent leader, combined with tight budgets and limited staff, gave Autopilot a longer "free ride." But the series of actions NHTSA has taken over the past 10 months suggests this may not last much longer.

Last June, NHTSA required automakers to report crashes in which driver-assistance systems were active; in August, it opened a defect investigation into Autopilot crashes at accident scenes; in September, it asked Tesla and more than a dozen competitors for documents about their automated systems; in October, it asked Tesla why it had not conducted a recall when it deployed a software update to improve detection of emergency vehicles, and sought information about the expanded availability of FSD; in November, Tesla conducted an FSD-related recall.

In February, Tesla conducted another FSD-related recall, disabling a setting that allowed vehicles to roll through stop signs. At the same time, NHTSA opened a second defect investigation into Autopilot.

Former safety officials, encouraged by the increasing scrutiny of Autopilot, are calling on NHTSA to exercise its recall authority and to seek the additional powers and resources needed to set safety standards.

A spokesperson for the agency said: "NHTSA has robust tools and authorities to protect the public, investigate potential safety issues, and compel recalls when evidence of noncompliance or an unreasonable safety risk is found. NHTSA has collected data, conducted research, developed test procedures, and measured their effectiveness, all of which are necessary prerequisites to developing safety standards."

Different forms of recall

If NHTSA concludes from either investigation that Autopilot is defective, it can order Tesla to conduct a recall. The law, however, allows Tesla to comply with a recall order in several different ways.

Fixing the defect could be as simple as an over-the-air (OTA) software update. Tesla has conducted several recalls this way before, updating Autopilot's software to prevent the system from operating in areas where it cannot yet navigate safely.

Ultimately, though, Tesla may have to adopt a more costly remedy. It might, for example, need to install cameras behind the steering wheel to monitor whether drivers are paying attention while using its systems, as other automakers do.

A manufacturer's third option for remedying a mandatory recall is a refund, which would also be the most expensive. Tesla has steadily raised the price of FSD, and it charged thousands of dollars for Autopilot before making it standard in 2019. Friedman said Tesla would have only itself to blame if NHTSA took action on Autopilot.

"Ever since the 2016 accident, when the system could not see the side of a big truck, the NTSB has been telling Tesla there is a serious problem," Friedman said in an interview. "The first thing you learn in driver training is: if there is an emergency vehicle, don't hit it."

More than five years ago, when NHTSA first investigated whether Autopilot was defective, it found that the Model S driver who hit the trailer had ignored warnings to remain in control. NHTSA said in a report that it had found no defect and would close the investigation. The report noted that data provided by Tesla showed crash rates dropped by nearly 40 percent after Autopilot features were installed.

Two years later, a data-analytics company questioned that finding. Quality Control Systems, Inc. sued the U.S. Department of Transportation for the mileage and crash data behind the NHTSA study, found the data incomplete, and criticized the safety claims made by Tesla and the regulator as "unreliable."

"You shouldn't just take Tesla at its word. NHTSA had a responsibility to conduct a high-quality analysis and do its homework, but it doesn't appear that it did," Friedman said. An agency spokesperson also said the report did not make claims about the effectiveness of Autopilot and that the underlying data lacked key information. (Compiled by Jiang Zhiwen, China Economic Network)
