
Tesla is "deeply involved" in the recall/regulatory investigation, and the "Takata airbag door" is a lesson for the past

For assisted and autonomous driving, the biggest variables are safety incidents and regulatory recalls.

As U.S. regulators ramp up scrutiny of Tesla's driver-assistance technology, the company is also drawing attention in Germany: the German Federal Motor Transport Authority (KBA) is investigating Tesla's automatic lane-change feature and whether it is approved for use in Europe.

This is not the first time German regulators have gone after Tesla. Two years ago, a German court ruled that Tesla had misled consumers in promoting the assisted-driving capabilities of its new cars.

An aggressive Tesla, however, is destined to remain a focus for regulators.

Last week, the National Highway Traffic Safety Administration (NHTSA) disclosed that it is conducting a second investigation into possible defects in Tesla's driver-assistance system (this time over so-called "phantom braking"), covering about 416,000 Model 3 and Model Y vehicles produced in 2021 and 2022.

Before that investigation was opened, Tesla had already issued a series of recalls. Two weeks ago, Tesla announced its 11th U.S. recall in roughly four months. And last August, after multiple Teslas collided with police cars and fire trucks, NHTSA opened an investigation into Autopilot for functional defects.

One

Early last year, NHTSA concluded its first such investigation into Tesla, finding that more than 200 reported cases of sudden acceleration and crashes across different Tesla models were due to driver error, not defects in the driver-assistance system.

The series of recalls that followed, however, is telling.

In June last year, Tesla announced a recall in the Chinese market covering a total of 285,500 vehicles, including imported Model 3s and China-made Model 3s and Model Ys. The stated reason: a problem with the active cruise control made it easy for drivers to activate the function by mistake, which could lead to unexpected acceleration.

At the same time, Tesla denied any brake failure, saying in a company statement: "To date there has not been a single case of brake failure, and no collisions or injuries have resulted from this issue."

Three months later, Tesla announced a U.S. recall of about 11,700 vehicles: Model S, Model 3, and Model X cars from the 2017-2021 model years, and Model Ys from 2020-2021. The reason: a communication error could cause false forward-collision warnings or unexpected activation of the automatic emergency braking (AEB) system.

In other words, unexpected AEB activation can cause the car to brake suddenly, increasing the risk of a crash. Responding to owners' phantom-braking complaints, Tesla CEO Elon Musk said publicly last year that the company was upgrading its software to eliminate the long-standing problem.

Earlier this year, Tesla announced another U.S. recall (53,822 vehicles) over a "rolling stop" feature in its Full Self-Driving (FSD) software that allowed the system to pass through stop-sign intersections at low speed without coming to a complete stop.

Interestingly, Tesla's owner's manual notes that at intersections without traffic-signal control, and when driving straight through a T-junction, Traffic Light and Stop Sign Control assumes by default that the vehicle has right-of-way and will not slow down or stop.

Ultimately, Tesla's answer is this: as with all of its driver-assistance features, the driver must keep watching the road ahead and be prepared to act immediately, including braking, because the feature may not work in all scenarios.

A few years ago, Mobileye announced it was ending its cooperation with Tesla, and one of the triggers was the fatal Autopilot accident that shocked the world at the time. "It is not enough to simply tell drivers to be vigilant; they must also be told why."

Some testing agencies, as well as automakers, say that in the vast majority of cases these systems already in mass production are effective, but not perfect. "Just read each maker's owner's manual: normal operation of the system depends on many conditions being met, and there are many failure conditions as well."

In fact, Tesla's owner's manual already lists a series of limitations that effectively disclaim responsibility.

For example, active cruise control cannot detect all obstacles. Especially when the vehicle is traveling above 80 km/h, it may not brake or decelerate if a vehicle or obstacle is only partially in the driving lane, or if a stationary or slow-moving vehicle or obstacle is revealed ahead after the vehicle in front leaves your lane.

Conversely, active cruise control may react to vehicles or objects that do not exist or are not in the lane of travel, causing unnecessary or inappropriate deceleration.
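To make the two failure modes concrete, here is a minimal, purely hypothetical sketch of the kind of gating a camera/radar ACC stack might apply before braking; every field name and threshold below is an illustrative assumption, not Tesla's actual logic:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """One perceived object, as reported by the sensor stack."""
    confidence: float    # detector confidence, 0..1
    lane_overlap: float  # fraction of the object inside our lane, 0..1
    speed_mps: float     # object's absolute speed, m/s
    just_revealed: bool  # appeared after a lead vehicle cut away

def should_brake(track: Track, ego_speed_kph: float) -> bool:
    """Hypothetical braking gate showing both failure modes.

    Misses (the manual's warning): at high ego speed, stationary or
    partially overlapping objects are filtered out to suppress
    nuisance alarms, so a real obstacle can be ignored.
    Phantom braking: a misdetection that scores high on confidence
    and lane overlap still passes the gate and brakes the car.
    """
    if track.confidence < 0.5:
        return False  # weak detection: ignored
    if track.lane_overlap < 0.5:
        return False  # object only partially in lane: ignored
    if ego_speed_kph > 80 and track.speed_mps < 1.0 and track.just_revealed:
        return False  # stationary object revealed by a cut-out at speed
    return True       # brake (rightly or wrongly)
```

The same thresholds that suppress nuisance braking create the blind spots the manual describes, which is why tuning against phantom braking and tuning against missed obstacles pull in opposite directions.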

Tesla is "deeply involved" in the recall/regulatory investigation, and the "Takata airbag door" is a lesson for the past

As for Traffic Light and Stop Sign Control, the owner's manual also notes that it is a beta feature and works best on roads the vehicle drives frequently. When any traffic signal is detected, including a green light, the feature will attempt to stop the vehicle.

Tesla warns that the feature can sometimes misdetect traffic lights or stop signs and cause the vehicle to decelerate unexpectedly, one of the classic phantom-braking scenarios.
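Read literally, the manual describes a default-stop policy; the toy function below (my reading of that description, not Tesla's code) shows why any false detection then translates directly into an unexpected deceleration:

```python
def signal_response(detected_signals: list[str]) -> str:
    """Default-stop policy: any detected signal, even 'green',
    triggers an attempted stop; a phantom detection (say, a
    billboard misread as a stop sign) therefore brakes the car."""
    if detected_signals:
        return "attempt_stop"   # applies to misdetections too
    return "maintain_speed"
```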

Tesla owners' complaints to NHTSA have mainly involved automatic braking at inappropriate times, including at highway speeds, because the system mistakenly believed there was an obstacle ahead.

Tesla is "deeply involved" in the recall/regulatory investigation, and the "Takata airbag door" is a lesson for the past

According to CARIAD, the Volkswagen Group's software company, the SOTIF standard (Safety Of The Intended Functionality) is crucial for autonomous driving, especially for decisions made by complex algorithms and artificial intelligence. It was largely absent from the development of many earlier high-end driver-assistance functions.

"We have to cover all situations where accidents can happen even for the most experienced drivers, but the system has to address those issues. Being able to predict and interpret the behavior of other road users is especially important. In the view of the above-mentioned person in charge, software is not a big problem, and ensuring system security is the most complicated.

Two

Beyond after-the-fact investigations, regulators are also stepping up efforts to steer consumers toward caution in buying and using driver-assistance systems, while tightening up-front constraints on manufacturers.

Earlier this year, the Insurance Institute for Highway Safety (IIHS) released a new rating program for driver-assistance systems, explicitly requiring cars equipped with them to include safeguards that keep drivers engaged, rather than letting semi-automated systems such as Autopilot be treated as if they were autonomous vehicles.

The ratings have four tiers; the highest requires the system to monitor whether the driver's hands are on the steering wheel and whether the driver is watching the road (if the driver fails to respond, the vehicle should slow down or come to a safe stop). It also requires that automatic lane changes be initiated by the driver.
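A minimal sketch of what such an escalation ladder could look like, with illustrative thresholds (IIHS requires that warnings escalate and that the car eventually slow or stop, not these exact numbers):

```python
import enum

class Escalation(enum.Enum):
    NONE = 0
    VISUAL_WARNING = 1
    AUDIBLE_WARNING = 2
    SLOW_AND_STOP = 3

def escalation_level(seconds_inattentive: float) -> Escalation:
    """Map continuous driver inattention (hands off the wheel or
    eyes off the road) to an escalating response; the time
    thresholds are assumptions for illustration only."""
    if seconds_inattentive < 3:
        return Escalation.NONE
    if seconds_inattentive < 8:
        return Escalation.VISUAL_WARNING
    if seconds_inattentive < 15:
        return Escalation.AUDIBLE_WARNING
    return Escalation.SLOW_AND_STOP  # decelerate to a safe stop
```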

China's previously issued rules on market access for intelligent connected vehicle makers and products likewise call for stronger safety management of autonomous-driving features. For example, the vehicle should automatically detect failures of the autonomous-driving system and whether its design operating conditions are still met, and take risk-mitigation measures to reach a minimal risk state.
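In pseudocode terms, that requirement amounts to a supervisor loop like the hypothetical sketch below (the field names and fallback sequence are my assumptions, not regulatory text):

```python
from dataclasses import dataclass

@dataclass
class SystemStatus:
    healthy: bool            # no detected system failure
    in_odd: bool             # design operating conditions still met
    driver_responsive: bool  # driver answered the takeover request

def next_action(s: SystemStatus) -> str:
    """Detect failure or ODD exit, hand over if the driver responds,
    otherwise fall back to a minimal risk maneuver."""
    if s.healthy and s.in_odd:
        return "continue_automated_driving"
    if s.driver_responsive:
        return "hand_over_to_driver"
    return "minimal_risk_maneuver"  # e.g. slow down, pull over, stop
```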

Since the Model 3, Tesla has pre-installed an in-cabin camera (positioned near the rearview mirror) covering the driver and rear passengers, but the company's owner's manual has said the camera is not used to monitor the driver.

Against a backdrop of repeated accidents, regulators have begun pressuring Tesla to add robust driver-monitoring mechanisms, such as the vision-based monitoring systems some automakers have already deployed.

In addition, under Euro NCAP's new protocols, vision-based driver monitoring will become a prerequisite for a five-star rating. Separately, from June 2022 all new cars entering the EU market must be equipped with similar safety systems, including driver drowsiness and attention warnings.

A greater risk may come from safety assessment of the system itself. So far, no regulator anywhere has set strict, demanding approval conditions for the key building blocks: perception, computing, planning, and decision-making.

"Even with enough data, the current camera performance is not enough," valeo CEO said, adding that safe and redundant sensors are particularly important in complex scenarios where people and vehicles are mixed. "We strongly believe that lidar is essential to achieve a further level of autonomy."

For now, Tesla's bet is to train its camera-based perception algorithms on massive amounts of collected data. And because so many cars pre-fitted with FSD hardware have already been delivered, retrofitting redundant safety hardware such as lidar would be enormously expensive for Tesla.

"Cameras can do a lot of things in perfect road and weather conditions," and to many people, "the problem is precisely in a lot of edge scenes, such as camera visibility is susceptible to light, weather, and a lot of uncommon objects."

According to industry insiders, the only practical way to meet redundancy requirements is to add other sensor types that compensate for the camera's inherent weaknesses while providing perception redundancy against camera failure. Sensor performance largely determines the operational design domain (ODD) of an assisted or automated driving system.
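As a toy illustration of the redundancy argument (purely hypothetical, not any vendor's architecture), a fusion gate can demand agreement between independent modalities before trusting a single-sensor verdict:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    obstacle: bool     # does this sensor see an obstacle ahead?
    confidence: float  # detector confidence, 0..1

def fused_obstacle(camera: Optional[Detection],
                   lidar: Optional[Detection]) -> bool:
    """Cross-check between modalities: agreement is trusted outright;
    a lone claim needs high confidence. Each modality's false
    positives (phantom braking) are damped while the other covers
    its blind spots (glare and fog for cameras, low-reflectivity
    objects for lidar)."""
    if camera and lidar:
        if camera.obstacle and lidar.obstacle:
            return True
        lone = camera if camera.obstacle else lidar
        return lone.obstacle and lone.confidence > 0.9
    lone = camera or lidar
    return bool(lone and lone.obstacle and lone.confidence > 0.9)
```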

"On L2+ level auto-assisted driving, lidar will improve the time to get out of hand in high-speed scenes. At L3 levels and above, lidar will become an indispensable enabler for safe autonomous driving. That's the judgment of Gunnar Juergens, vice president and head of the lidar division at Continental.

Just days ago, Volkswagen CEO Herbert Diess said publicly that lidar, or more precisely redundant perception, is necessary to achieve autonomy. The company is currently cooperating with Mobileye on lidar.

Given the constant safety regulation the automotive industry operates under, "caution" beats "aggression" for the automakers in the thick of it, at least for features already officially delivered to users.

A few years ago, the Japanese auto-parts supplier Takata paid a heavy price for refusing to face up to product defects and concealing the facts for years. More than 100 million vehicles worldwide had been fitted with Takata's defective airbags, and the ensuing massive recall ultimately pushed the company into bankruptcy protection.
