Tesla vs Driver: Who is driving "autonomously"?

Tesla recently made big news: at the request of the National Highway Traffic Safety Administration (NHTSA), it is recalling nearly 54,000 cars in the United States.

Why?

Once again, the problem lies with its Full Self-Driving software: a "rolling stop" feature can let cars roll slowly through stop signs at intersections rather than come to a complete stop.

In the United States, every driver knows that at a stop sign, regardless of whether any other vehicles or pedestrians are visible, the car must come to a complete stop before moving forward. A feature that rolls through stop signs therefore plainly violates the law in many states and may increase the risk of accidents.
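To make the distinction concrete, here is a minimal, purely illustrative Python sketch of the two behaviors. All names are hypothetical; this is not Tesla's actual control logic, and the 5.6 mph figure is the creep speed reportedly cited in the recall.

```python
# Purely illustrative sketch of "complete stop" vs. "rolling stop"
# behavior at a stop sign. Hypothetical names; not Tesla's actual code.

def complete_stop_speed(at_stop_sign: bool, current_mph: float) -> float:
    """What U.S. law requires: a full stop before proceeding,
    regardless of whether cross traffic or pedestrians are visible."""
    if at_stop_sign:
        return 0.0
    return current_mph

def rolling_stop_speed(at_stop_sign: bool, current_mph: float,
                       intersection_looks_clear: bool) -> float:
    """The recalled behavior: if the intersection looks clear, creep
    through at low speed (reportedly up to ~5.6 mph) instead of stopping."""
    if at_stop_sign and intersection_looks_clear:
        return min(current_mph, 5.6)
    if at_stop_sign:
        return 0.0
    return current_mph
```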

Why does Tesla market its software as fully self-driving when, in reality, it has so many bugs? And when those bugs have serious consequences, who bears the responsibility: the driver behind the wheel, or Tesla, which painted the beautiful picture the driver believed in?

Driver or Car: Whose Fault Is It?

Last month, the world's first serious criminal prosecution involving Tesla Autopilot surfaced.

The accident in this case occurred in 2019 in Gardena, a suburb of Los Angeles.

Late at night on December 29, 2019, driver Kevin Riad was exiting a highway off-ramp in his Tesla Model S when he ran a red light at high speed and crashed into a Honda sedan driving normally through the intersection. The two people in the Honda were killed; Riad and a female passenger in the Tesla suffered minor injuries and are recovering. Both cars were totaled.

Afterwards, NHTSA announced it was dispatching a special investigation team, which had previously investigated more than a dozen crashes related to Tesla's Autopilot feature.

Last October, California prosecutors formally charged Riad with two felony counts of vehicular manslaughter. The first hearing is scheduled for February 23 this year.

To be clear, the defendant in this case is the driver of the vehicle that caused the accident.

While Autopilot technology undoubtedly played a key role in the accident and will be highlighted at trial, neither the technology itself nor Tesla is a defendant in this case.

(However, U.S. federal and local regulators are also investigating Tesla over Autopilot.)

[Image: the scene of the crash. Image source: local TV station KCAL-9]

The case is also the first known felony prosecution in the United States of a driver who caused a fatal accident while using Autopilot. It also means that even if the driver thinks the car is "driving itself," the driver himself still bears responsibility for an accident.

Several experts in the law of autonomous driving, including Bryant Walker Smith, a law professor at the University of South Carolina, and attorney Donald Slavik, said that California's prosecution of Riad is the first serious criminal case arising from a fatality involving an assisted/autonomous driving feature.

In addition, the victims' families have filed civil lawsuits against both the driver and Tesla. In its complaint, the family of victim Lopez claims that the Model S "suddenly sped up to an uncontrollable speed" at the time of the crash, and accuses Riad of having a poor driving record that marks him as a "dangerous driver." These civil suits will be heard separately and are not connected to the California criminal prosecution for now.

Autonomous driving or assisted driving?

If you follow electric vehicles and the surrounding technology industry closely enough, you probably know that the feature Tesla has named "Autopilot" is not really an autopilot system.

In essence, Autopilot is an upgraded version of the adaptive cruise features many cars already have: it can hold a set speed, keep to a lane, adjust speed to traffic, and so on. It has come standard at no extra cost on models produced after April 2019; models from 2016-2019 include the hardware Autopilot requires but need a paid activation.

According to the industry's widely accepted SAE International standard, Autopilot falls in the L2-L3 range and does not reach what is generally recognized as "autonomous driving."

For example, an important marker of L3 under the SAE standard (see figure below) is that the driver may, in certain situations, take their attention off the road. Tesla's own feature descriptions and tutorials, by contrast, tell owners that even with Autopilot engaged, the driver must stay attentive.

[Figure: SAE International levels of driving automation]
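As a rough aide-mémoire (a simplified paraphrase, not the standard's official wording), the levels and the attention question can be summarized in a small Python sketch:

```python
# Simplified summary of the SAE J3016 automation levels; see the
# figure above for the standard's actual definitions.

SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: one assist feature; the human drives",
    2: "Partial automation: steering and speed assist; "
       "the human must supervise at all times",
    3: "Conditional automation: the system drives in limited conditions; "
       "the human may look away but must take over on request",
    4: "High automation: no human driver needed within a defined domain",
    5: "Full automation: no human driver needed anywhere",
}

def driver_must_watch_road(level: int) -> bool:
    """At L2 and below, the human must keep watching the road continuously;
    from L3 upward, attention can be withdrawn at least some of the time."""
    return level <= 2

# By Tesla's own instructions to owners, Autopilot behaves like L2 here:
assert driver_must_watch_road(2) is True
assert driver_must_watch_road(3) is False
```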

Autopilot also has an FSD mode, which adds automatic lane changes, speed adjustment based on surrounding traffic, driving on and off ramps, automatic parking, responding to traffic lights and stop signs, and more.

FSD stands for Full Self-Driving. Most of its features have been commercialized; a few are still in beta, but even those are open to consumers on production models, as long as they pay for the FSD upgrade.

However, regulators do not seem to agree with Tesla's claims and practices.

In the California case against Riad, NHTSA issued a statement that made three points directly:

1) Although many vehicles have driver-assistance features, in law only the human driver is the responsible party.

2) All cars, whether or not they have partial self-driving features, and whether or not those features are switched on, require the driver to remain in full control ("every vehicle requires the human driver to be in control at all times").

3) No car on the market today is capable of autonomous driving ("no vehicle on sale that can drive itself").

NHTSA's statement makes the position clear: in traffic-accident cases involving Autopilot and similar driver-assistance features, even where those features are a direct cause of the crash, regulators still treat the driver who caused the accident as the responsible party and the object of punishment.

As for the third point above: considering that Tesla and its executives have, on many occasions in the past, seriously exaggerated Autopilot's actual capabilities and the sophistication of Tesla's self-driving technology, NHTSA's statement also reads as a direct rebuttal of their messaging.

On May 7, 2016, in Florida, a Model S with Autopilot engaged crashed into a truck. The driver was killed instantly: Autopilot's first fatality in the United States. At the time, the feature had been available for only about half a year.

By Silicon Star's count, since the feature's release, Autopilot-related traffic accidents worldwide are known to have killed at least six people and seriously injured many more.

The National Transportation Safety Board (NTSB), the U.S. government's civilian transportation-accident investigator, has opened special investigations into at least 28 Tesla crashes involving deaths or injuries. These accidents involve a range of factors, including Autopilot, battery fires, and brake failures.

Last year, NHTSA issued a notice announcing a more detailed investigation into 11 specific Autopilot-related crashes, covering Tesla's design of features such as Autopilot and Traffic-Aware Cruise Control (TACC).

In Europe, Tesla's over-promotion has drawn government attention. In Germany, for example, a body that polices unfair competition sued Tesla in 2020, arguing that the naming and promotion of Autopilot and FSD misled the public, and that vehicles using these features on the road violated traffic-safety regulations.

The court ruled against Tesla. Since then, Tesla has been banned in Germany from using terms such as "Autopilot" and "FSD," and from continuing to persuade consumers in its marketing that the cars can "drive themselves."

Cool or damned?

Many Silicon Star readers will remember that in 2018, on the Mountain View stretch of Highway 101, Chinese-American owner Walter Huang was driving his Model X when it suddenly hit a concrete barrier during normal forward driving. He died in the crash.

Autopilot was engaged at the time, yet the high-end Model X's many powerful sensors and advanced road-scanning algorithms failed to detect in time that the road ahead narrowed, leading to the crash.

Investigations related to the case show that Huang had complained several times to his wife, family, and other Model X owners that Autopilot was unreliable. Specifically, Autopilot had steered his car several times toward the very barrier where he later died.

Before his death, Huang worked as an engineer at Apple. Yet even he lacked a sufficiently clear understanding of Autopilot's limitations, let alone ordinary owners outside the tech industry who are unfamiliar with electric vehicles and assisted/autonomous driving technology.

Many owners choose Tesla precisely for its high-tech aura, which feels very "cool." And Autopilot looked like the long-awaited arrival of autonomous driving: its earliest and most high-profile application in mass-produced, commercially available cars.

As a previous Silicon Star article pointed out, the problem now is that some electric-vehicle and new-car brands, Tesla included, over-promote their driver-assistance features and exaggerate their capabilities, creating misunderstandings among users and indirectly leading to more frequent accidents.

As a result, assisted-driving technologies that fall short of autonomous driving, packaged under the guise of autonomous driving, have not made road traffic safer; they have caused more accidents.

Matthew Avery, a British auto-safety expert and Euro NCAP board member, told local media that Tesla's promotion of Autopilot, FSD, and related features is extremely misleading: consumers unfamiliar with the technology can easily come to believe that once they pay to upgrade to Autopilot/FSD, their cars are capable of fully autonomous driving.

"Many of the accidents caused by Autopilot are fatal. Whether these drivers were joking around or, worse, truly believed Autopilot was fully autonomous, we don't know."

Last year, Tesla pushed FSD Beta 9 software to some vehicles, letting selected owners supply Tesla with road-test data.

But some owners soon discovered that this version of FSD was highly unstable and unsafe, mistaking the moon for a traffic light and drifting into bike lanes from time to time. Allowed onto public roads, such cars would undoubtedly be a serious safety hazard.

Consumer Reports, a consumer-advocacy organization, released an investigative report asserting that the FSD test software Tesla pushed out lacked safeguards and posed serious risks to other road users.

[Image: FSD Beta 9 mistakes the moon for a traffic light. Screenshot source: Jordan Nelson]

Recently, an advocacy group called The Dawn Project ran a full-page ad in The New York Times titled "Don't be a Tesla crash test dummy," accusing FSD of being dangerously unreliable and arguing that by shipping the feature on production cars, Tesla turns owners into road-test guinea pigs, a practice that gets people killed.

"A Fortune 500 company selling thousands of cars that run the worst software, driving on open roads, with our families as the crash-test dummies: that is not what we want," the ad reads.

[Image: The Dawn Project's full-page advertisement in The New York Times]

After years on the market, Tesla's electric vehicles are now everywhere, and the sense of uniqueness, high technology, and "cool" that ownership once conferred on owners has faded.

What we observe today is that Autopilot, FSD, and the like are in fact driver-assistance systems marketed as autonomous driving, propping up Tesla's life as a "cool" brand.

But leaving owners under an illusion about what these features can do, and causing more crashes and more loss of life and property as a result: that is not cool at all.
