
Should car companies be held responsible for navigation-assisted driving accidents?

Please cherish the owners who are willing to test smart driving with their money, and even their personal safety.

Author | Xiao Ying

"Pilot assisted driving" is synonymous with the highest level of intelligence in mass production vehicles.

From Tesla, to NIO, Xpeng, and Li Auto, and on to mass-market brands such as Great Wall and GAC, this function has become a core selling point in car companies' marketing.

Beyond carmakers' in-house development, a number of suppliers have recently announced that they can help car companies deliver this function, and navigation-assisted driving is expected to begin reaching production vehicles at scale this year.

At the same time, the scenarios where navigation-assisted driving is deployed will expand from highways and expressways to urban roads.

But before hyping this feature, car companies should probably ask themselves three questions:

Is your "Pilot Assisted Driving" mature enough? Behind the accidents caused by this function, why is there a series of Rashomon incidents? Are car companies really not responsible for such accidents?

01

The fuzzy positioning of "navigation-assisted driving"

The name "Navigated assisted driving" comes from the Tesla NOA (Navigated on Autopilot).

At present, the car companies capable of delivering highway navigation-assisted driving are mainly Tesla, NIO, Xpeng, Li Auto, and Great Wall Motors. Let's review when each of them launched the function:

● In June 2019, Tesla pushed NOA (Navigate on Autopilot) to domestic owners who had opted for FSD.

● In September 2020, NIO unveiled NOP (Navigate on Pilot) at the Beijing Auto Show and officially pushed it in October 2020.

● In January 2021, Xpeng Motors opened the public beta of NGP (Navigation Guided Pilot) to users.

● In November 2021, Great Wall Motors officially pushed the NOH (Navigation on HPilot) function to WEY Mocha users.

● In December 2021, Li Auto pushed its NOA navigation-assisted driving function to users via OTA update.

In addition, GAC Aion, along with intelligent-driving technology companies such as Huawei and Horizon Robotics, has announced that it already has a navigation-assisted driving solution, expected to reach mass production within the year.

Beyond that, it is foreseeable that models that have added lidar to their sensor suites, such as the Zhiji (IM) L7, Jidu's first model, the Neta S, Lotus Eletre, Salon Mecha Dragon, Avatr 11, BMW iX, and Mercedes-Benz S-Class, will also launch navigation-assisted driving once they reach mass production.

Clearly, navigation-assisted driving is no longer a selling point unique to a few new car-making forces; it is set to become a mainstream configuration deployed at scale in the near future.

Why does "pilot assisted driving" become a selling point for car companies to show off their intelligent capabilities? What are the functional differences between it and L2 assisted driving?

In terms of driving-automation levels, the industry generally regards navigation-assisted driving as sitting somewhere between L2 and L3.

This vague positioning is telling: it satisfies car companies' desire to tout their intelligence capabilities, while also helping them conveniently sidestep regulatory risk.

L2 intelligent driving has three core functions: adaptive cruise control (ACC), lane centering control (LCC), and automatic emergency braking (AEB).

On top of L2, L3 adds the ability to change lanes and overtake autonomously and to avoid obstacles automatically. This means that, under specific road conditions, the driver can take hands and feet off entirely while the vehicle handles everything that arises during the drive.
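To make the ACC piece of that feature stack concrete, here is a minimal, purely illustrative sketch of a gap-keeping controller. The function name, gains, and limits below are assumptions for explanation only, not any carmaker's implementation.

```python
from typing import Optional

def acc_acceleration(ego_speed_mps: float,
                     set_speed_mps: float,
                     lead_distance_m: Optional[float],
                     lead_speed_mps: Optional[float],
                     time_gap_s: float = 2.0) -> float:
    """Toy ACC: return a commanded acceleration in m/s^2 (positive = speed up)."""
    # No vehicle ahead: behave like plain cruise control toward the set speed.
    if lead_distance_m is None or lead_speed_mps is None:
        return 0.3 * (set_speed_mps - ego_speed_mps)

    # Vehicle ahead: hold a time-gap-based following distance.
    desired_gap_m = max(5.0, time_gap_s * ego_speed_mps)
    gap_error = lead_distance_m - desired_gap_m    # negative = too close
    speed_error = lead_speed_mps - ego_speed_mps   # negative = closing in
    accel = 0.1 * gap_error + 0.5 * speed_error
    # Clamp to rough comfort/braking limits.
    return max(-3.5, min(accel, 2.0))

# Ego at 33 m/s (about 120 km/h) with a lead car 40 m ahead doing 25 m/s:
# the controller commands firm braking (clamped at -3.5 m/s^2).
print(acc_acceleration(33.0, 33.3, 40.0, 25.0))
```

Even in this toy form, nothing in the controller watches the driver; keeping a human ready to take over is an obligation layered on top of it, which is precisely where L2 and L3 diverge.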

In other words, L2 requires the driver to stay ready to take over at all times, while L3 allows the driver to do something else in specific scenarios. This is also where the two differ fundamentally in accident liability: in an accident with L2 intelligent driving engaged, the driver is responsible; in an accident caused by L3, the vehicle side is responsible.

National recommended standard Taxonomy of Driving Automation for Vehicles (GB/T 40429-2021)

Car companies are therefore happy to describe navigation-assisted driving as a function infinitely close to L3, demonstrating their intelligence capabilities by emphasizing how rarely takeovers are needed and how often lane-change overtakes succeed.

For example, Xpeng Motors announced the following figures from a 3,000-kilometer NGP challenge:

"The average number of takeovers per 100 kilometers is 0.71, the success rate of lane change overtaking is 94.41%, the success rate of ramp passing is 92.76%, and the success rate of tunnel passing is 94.95%."

These figures give the impression that Xpeng's highway NGP is safe in 99% of cases and hardly ever needs to be taken over; 0.71 takeovers per 100 kilometers works out to roughly one takeover every 140 kilometers.

At the same time, because the function is never positioned outright as L3, car companies have also found themselves a convenient basis for disclaiming liability when navigation-assisted driving accidents occur.

This produces a very strange phenomenon: on the one hand, the car company advertises powerful intelligence that greatly improves driving comfort and safety; on the other hand, it cannot guarantee there will be no accidents, and once an accident happens, the driver bears the responsibility.

Of course, car companies are never this blunt in their advertising; instead, they reinforce the impression that navigation-assisted driving is safe and reliable while downplaying the probability and risk of accidents.

This leads to an inevitable problem: the louder the intelligence slogans, the more consumers trust the car company, and the more accidents follow.

02

Baffling Rashomon cases

Over the years, Tesla has had the most traffic accidents caused by drivers failing to take over from intelligent driving in time, followed by NIO, Xpeng, and Li Auto.

Tesla's first fatal intelligent-driving accident occurred in January 2016 on the Handan section of the Beijing-Hong Kong-Macao Expressway in Hebei Province, where a Tesla Model S drove straight into a street sweeper working on the road, killing the driver.

The most recent occurred in May 2021 in Fontana, California, where a Tesla Model 3 crashed into an overturned truck, killing the Tesla's owner.

In those five years, leaving minor accidents aside, at least 10 people have died while using Tesla's navigation-assisted driving.

These accidents share nearly identical characteristics: stationary obstacles on the highway, such as overturned vehicles, construction vehicles at work, and large trucks parked at a standstill.

The main reason is that navigation-assisted driving is merely an upgrade built on top of ACC and LCC, with a perception scheme that relies mainly on cameras plus millimeter-wave radar. Cameras suffer from missed detections, and millimeter-wave radar is poor at detecting stationary objects, so some scenes simply cannot be recognized, and traffic accidents follow.
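To see how a stationary vehicle can slip through that kind of perception stack, here is a deliberately simplified, hypothetical sketch of the clutter-filtering logic the paragraph describes. Every name and threshold below is invented for illustration; it is not any supplier's or carmaker's actual code.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float              # distance to the reflection
    relative_speed_mps: float   # speed relative to the ego vehicle (negative = closing)
    ego_speed_mps: float        # ego vehicle speed at measurement time

def is_stationary(ret: RadarReturn, tol_mps: float = 1.0) -> bool:
    # Absolute speed of the object = ego speed + relative speed.
    # A stopped truck ahead therefore shows up with absolute speed near zero.
    return abs(ret.ego_speed_mps + ret.relative_speed_mps) < tol_mps

def keep_as_target(ret: RadarReturn, camera_confirms: bool) -> bool:
    """Return True if the object survives fusion as a braking target."""
    if is_stationary(ret) and not camera_confirms:
        # Stationary radar returns are routinely discarded as clutter
        # (guardrails, overhead signs, manhole covers). If the camera also
        # misses the object, the stopped vehicle never becomes a target.
        return False
    return True

# A stalled truck 80 m ahead while the ego car cruises at 33 m/s (~120 km/h):
truck = RadarReturn(range_m=80.0, relative_speed_mps=-33.0, ego_speed_mps=33.0)
print(keep_as_target(truck, camera_confirms=False))  # False -> no braking is triggered
```

The sketch only restates the failure mode above: once stationary returns are treated as clutter and the camera misses the obstacle too, the system has nothing left to brake for.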

A similar accident occurred with NIO.

In August 2021, Mr. Lin, a NIO ES8 owner, crashed into a highway maintenance vehicle on the Hanjiang section of the Shenhai Expressway with the NOP function enabled, and died in the accident.

NIO ES8

Li Auto and Xpeng Motors have also had accidents involving navigation-assisted driving, though fortunately without casualties.

In September 2020, a Li ONE owner in Qingdao, Shandong Province was driving with his family in the inner lane of the G18 expressway when a van ahead on the right signaled left and began changing lanes into the inner lane; the Li ONE did not slow down and ran straight into it.

The most recent car company to draw attention over an assisted-driving accident is Xpeng Motors.

In mid-April, Xpeng Motors became a trending topic because of a piece of breaking news: Mr. Deng, an Xpeng P7 owner, had a serious accident on the Linxiang-Yueyang expressway section in Hunan Province while intelligent assisted driving was turned on.

According to the owner, the Xpeng P7, cruising at a set speed of 80 km/h, ran straight into an overturned, stationary vehicle ahead. There was no sign of deceleration, no collision warning, and no emergency braking before the impact, and by the time the owner reacted and intervened, it was too late to brake.

As the accident gained attention, Xpeng Motors responded quickly, saying that "the preliminary judgment is that, while using ACC + LCC (adaptive cruise and lane centering), the owner failed to keep watching the environment ahead of the vehicle and to take over in time."

Reading those two sentences, Xpeng's response sounds perfectly reasonable. The reason Xpeng Motors can so confidently pin the problem on the driver is exactly what we analyzed above: no matter how strongly the Xpeng P7's intelligence is promoted, as long as the fine print adds "the vehicle has not yet achieved full autonomous driving, and the driver must remain ready to take over at any time," the company can extricate itself and escape all legal responsibility.

Other car companies handle such accidents with much the same attitude as Xpeng: they express sympathy for the owner's unfortunate experience, but resolutely refuse to accept that they should bear responsibility for the accident.
