"Technology is innocent." That is a claim once made by the founder of a Chinese internet startup. In his view, no technology carries any value attributes of its own, least of all technology made of code and programs; it is only the people who use it who attach the attributes and labels that suit their needs, and that is what gives a technology its "good" or "evil." I think the same holds in the newly emerging field of automotive autonomous driving. Autonomous driving, or strictly speaking advanced driver assistance, points the way for the future of the automobile, yet in its "growing" stage it still, from time to time, "does evil."

In 2019, there was a case in the United States in which a Tesla Model S rear-ended a Honda Civic, killing two people in the Civic. In October 2021, prosecutors filed charges against the Model S driver, who may face felony counts. One striking detail of the investigation is that Autopilot, Tesla's driver assistance system, was engaged before the crash. Many people disagree with charging the owner, arguing that since the accident happened while the car was in "autonomous" mode, the party responsible should be the Model S itself, and by extension Tesla, because the accident at the very least proves that Autopilot was defective; otherwise, how could a fatal rear-end collision have occurred?
In China, too, there have been accidents with advanced driver assistance engaged. On January 12, a four-car pile-up occurred on Shanghai's Inner Ring Elevated Road: a Model 3 running with driver assistance on "shoveled" straight into a BYD Song Pro DM, lifting the car ahead so that its chassis was clearly visible.
There have also been netizens who posted videos of themselves engaging driver assistance and taking their hands off the steering wheel, only to be held accountable by the traffic authorities after the fact. These cases all point to the same thing: both in China and abroad, even though accidents involving "autonomous driving" or advanced driver assistance prompt many people to blame the manufacturer, the outcome is generally that the person held responsible is the driver of the vehicle, not the vehicle itself. Whether the manufacturer should also bear responsibility is a question on another level.
In fact, neither in China nor overseas does true autonomous driving currently exist. Everything sold as "autonomous driving" is borderline marketing by manufacturers, taken at face value by consumers who trust them. Many owners believe that because their car has a certain level of driver assistance, it can effectively drive itself. Manufacturers, however, have been careful to build a liability "firewall" into their promotion of these technologies, and to date we have not seen a single successful claim against a manufacturer after an owner caused an accident with "autonomous driving" or advanced driver assistance engaged. For now, whatever results from trusting "autonomous driving" is the driver's own responsibility.
From a technical point of view, in a reasonably ideal environment, today's L3 and L4 driver assistance systems could indeed deliver autonomous driving. But that ideal environment exists essentially only at the hardware and infrastructure level: high-precision map support, highly responsive radar, well-planned and clearly marked lanes, and so on. The one thing that cannot be idealized is human unpredictability. As long as only a small share of vehicles have autonomous driving or advanced driver assistance, the environment around them remains dangerous, because the surrounding vehicles are driven by people, and regardless of how skilled those drivers are, their unpredictable behavior alone is enough to defeat the judgment and decision-making of these "autonomous" vehicles. So in this transitional stage, in which highly intelligent cars share the road with traditional ones, handing the steering wheel over to an automated or assistance system is itself irresponsible toward other road users and toward the driver.
So autonomous driving is still at a preliminary, "adolescent" stage, and that applies to every model that claims to offer it. At this stage, any driver of sound mind who causes an accident because they trusted so-called autonomous driving has only themselves to answer for it.
Of course, there is another angle: owners differ in knowledge and in how they understand autonomous driving. A cautious person chooses not to trust the system, while a "bold" driver relies on it completely. How should this cognitive gap be defined and balanced? In such a "gray" period, when the rules lag behind the technology, the driver still has to remain in charge, because there are too many uncontrollable factors and the vehicle's systems cannot cover them all. Technology built purely on computation cannot be called intelligence; only people possess intelligence in the true sense. Take an extreme case: suppose the driver assistance system is excellent, but the road is wet and tire grip drops; the system detects the car ahead and triggers automatic braking, yet the car simply cannot stop in time. How do we account for that? Can we still say it is a problem of the manufacturer and the technology?
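To make that extreme case concrete, here is a rough back-of-the-envelope estimate; the speed and friction coefficients are assumed for illustration only, not drawn from any real incident. Ignoring reaction time, the minimum braking distance is roughly d = v² / (2μg). At 80 km/h (about 22.2 m/s), dry asphalt with μ ≈ 0.8 gives d ≈ 22.2² / (2 × 0.8 × 9.8) ≈ 31 m, while a wet or icy surface with μ ≈ 0.3 gives d ≈ 84 m, nearly three times farther. No braking algorithm, however good, can close that gap if the car ahead is only 50 m away.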
So in essence, at least for now, the responsibility for any accident that occurs with autonomous driving or advanced driver assistance engaged still rests with the driver, and at the regulatory level, hands-off "autonomous" driving is simply not sanctioned. Perhaps in the future, when the vast majority of vehicles have advanced driver assistance or genuine autonomous driving capability, manufacturers and the "technology" itself will be the ones held responsible for accidents.