A US Tesla owner's Autopilot killed two people and he has been charged. Who is responsible for an autonomous-driving accident?

With the rapid development of new energy vehicles, the accompanying automated-driving functions have been hyped relentlessly by the major automakers, and have even become a standard talking point for salespeople. However, the new risks brought by new technology cannot be ignored. US media recently reported a case in which Tesla driver Kevin George Aziz Riad was charged with manslaughter for misusing the driver-assistance system, causing two deaths. This may be the first case anywhere in which a driver faces criminal charges over a fatal crash involving a car's automated-driving function.

The incident occurred in suburban Los Angeles in December 2019, when Riad's Tesla Model S ran a red light after exiting a freeway and crashed into a Honda Civic, killing the Civic's two occupants. According to the National Highway Traffic Safety Administration (NHTSA), the driver was using Tesla's Autopilot (AP) function at the time of the accident.

US media pointed out that the driver failed to take over the steering wheel in time when the situation arose; his hands were not on the wheel at the moment of the accident, leaving the vehicle effectively uncontrolled and leading to the tragedy.

Frequent accidents: Tesla's Autopilot reliability in doubt

In the field of new energy vehicles, Tesla has been a pioneer in self-driving technology, but the accompanying controversy has been endless. In its vehicle safety report for the fourth quarter of 2021, Tesla said that with Autopilot and other safety features activated, its cars recorded an average of one accident every 6.94 million kilometers; with Autopilot off, the figure was one accident per 2.56 million kilometers. Over the same period, NHTSA data showed the average US vehicle had one traffic accident per 779,000 kilometers. On this basis, Tesla claims its automated driving is 8.9 times more reliable than a human driver.
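Tesla's headline ratio can be sanity-checked with simple arithmetic using the figures quoted above (this only reproduces the division; it says nothing about whether comparing these two datasets is methodologically fair):

```python
# Distances between recorded accidents, in kilometers (figures quoted above)
tesla_autopilot_km = 6_940_000  # Tesla, with Autopilot and safety features on
tesla_no_autopilot_km = 2_560_000  # Tesla, with Autopilot off
us_average_km = 779_000  # NHTSA figure for the average US vehicle

# The ratio behind Tesla's "8.9 times more reliable" claim
ratio = tesla_autopilot_km / us_average_km
print(f"{ratio:.1f}")  # prints 8.9
```

Note that even Tesla's own non-Autopilot figure (2.56 million km) is already several times better than the national average, which suggests fleet composition and driving conditions, not just Autopilot, drive much of the gap.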

In fact, this comparison reads more like self-promotion, heavy with marketing flavor. Over the years, accidents involving Tesla's Autopilot features have been common.

In 2016, a driver in the United States died in a crash on a highway after engaging Autopilot in a Tesla Model S. Tesla said Autopilot failed to recognize the white side of a left-turning truck against the brightly lit sky.

In January 2016, a Tesla rear-ended a vehicle on the Handan section of the Beijing-Hong Kong-Macao Expressway in Hebei Province, China, and the driver died. The driver's family attributed the crash to Tesla's Autopilot function and sued Tesla's China sales company.

In April 2018, an Apple engineer driving a Tesla Model X with Autopilot engaged crashed into a highway barrier and later died of his injuries.

In May 2021, a Tesla driver in California turned on Autopilot and climbed into the back seat while the car was moving, and was arrested by police.

According to available statistics, NHTSA has investigated 26 Autopilot-related accidents since 2016, in which at least 11 people died.

A rational view: new technology should not be a marketing tool

In the current automotive market, autonomous driving has become a headline technology; models without it are widely dismissed as outdated. At the vast majority of new-model launches, L2-level ADAS (advanced driver assistance system) functions are hyped as one of the car's major selling points.

The technology's original intent is good; the trouble is that overwhelming, often boastful marketing leads drivers to trust it far too much. At its current stage of development, "autonomous driving" should really be called a driver-assistance function. Yet Tesla publicly markets its FSD upgrade package as "Full Self-Driving," giving drivers a considerable chance of misunderstanding its capabilities.

Many videos posted by Tesla owners show the reckless practice of taking both hands off the steering wheel, which some owners say sales consultants demonstrated during test drives. Many domestic automakers likewise claim "L2+" assisted-driving capabilities. The result of such excessive hyperbole is that user expectations exceed what the technology can actually do.

Why did earlier assistance features such as cruise control never draw this kind of criticism? Because their usage instructions stated clearly that the driver must remain in control of the car with hands on the wheel, leaving no room for misunderstanding. In today's marketing, by contrast, L2+ assisted driving is presented as all but omnipotent, covering highways and urban roads alike.

However, whether it is Tesla's pure-vision approach or the lidar-plus-vision schemes favored by domestic automakers, one problem cannot be hidden: the recognition success rate can never be perfect. Road conditions are ever-changing, and errors will always occur, for reasons rooted in both hardware and software as well as in sheer accident probability. The same is true of human drivers, of course; but when a driver controls the vehicle himself and misjudges the road, the fault is his own. When automated-driving technology fails, who is to blame?

Moreover, drivers tend toward complacency: once the automated-driving function is on and the system takes over part of the operation, attention inevitably drifts, making it difficult to respond in time to unexpected situations on the road.

A promising future: autonomous driving technology is no menace

Every new technology's adoption comes with growing pains; moving from immaturity to maturity is the normal course of development. But we must not rush things or misjudge the application scenarios of new technology. The national standard "Taxonomy of Driving Automation for Vehicles" issued by the State Administration for Market Regulation (standard number: GB/T 40429-2021) strictly defines driving automation, systems and functions, minimal-risk condition and strategy, intervention requests and takeover, system failure, and so on, and divides vehicle driving automation into levels L0 through L5.
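The GB/T 40429-2021 tiers broadly mirror the well-known SAE-style levels. The sketch below is a paraphrased, illustrative summary for orientation only, not the standard's exact wording; consult the standard itself for the binding definitions:

```python
# Approximate summary of the GB/T 40429-2021 driving-automation levels
# (paraphrased for illustration; not the standard's official wording)
DRIVING_AUTOMATION_LEVELS = {
    0: "Emergency assistance only; the driver performs all driving tasks",
    1: "Partial driver assistance: steering OR speed control, not both",
    2: "Combined driver assistance: steering AND speed; driver must supervise",
    3: "Conditionally automated; driver must respond to takeover requests",
    4: "Highly automated within a defined operational design domain",
    5: "Fully automated under all conditions",
}

def is_driver_assistance(level: int) -> bool:
    """Levels 0-2 are assistance: the human driver remains responsible."""
    return 0 <= level <= 2

# The "L2+" systems marketed today still fall on the assistance side
print(is_driver_assistance(2))  # prints True
```

The key line for this article's argument is the boundary at level 2: everything currently sold as "L2+" keeps legal responsibility with the human driver.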

As a result, what were once vague industry conventions have been clearly defined and elevated to a national standard, which takes effect on March 1, 2022; once it is in force, automakers will no longer be able to overstate the level of their technology in advertising. The Road Traffic Safety Law (Revised Draft) issued by the Ministry of Public Security in April last year also added management requirements for automated driving. All of this shows that the relevant state departments have taken note of the problems surrounding autonomous-driving technology and have formulated regulations and standards to address them.

In general, automated-driving technology is neither a menace nor a panacea. We must view its merits and faults rationally, improve consumers' understanding of it, and put an end to companies' excessive marketing of it, so that the development of automated driving stays on the right track.
