
"Hands-off, eyes-off" takeover test results are striking: camera-based DMS outperforms steering wheel sensing

As the fitment rate of ADAS functions rises, safety assessment of these systems has become a focus for regulators worldwide. The biggest challenge among them is driver monitoring.

Earlier this year, the Insurance Institute for Highway Safety (IIHS) officially released a new rating system for assisted driving systems, which explicitly requires new cars equipped with such systems to take safety measures that help drivers stay focused.

The new rating has four tiers. The highest tier requires the system to monitor whether the driver's hands are on the steering wheel and whether the driver is paying attention to the road ahead. In addition, automated lane changes must be initiated or confirmed by the driver.

Under UN Regulation 79, all new cars offering Lane Keeping Assist Systems (LKAS) must be equipped with hands-off detection (HoD); the European Union has adopted this requirement for new cars produced from April 1, 2021.

There are currently several ways to implement HoD, such as capacitive sensing, camera monitoring, and pressure sensing. In the Chinese market, capacitive steering wheels (for hands-off detection) are at present fitted to only a few German joint-venture brands, while some smart models, including Tesla, use pressure-sensing steering wheels.

According to recent vehicle test data released by the American Automobile Association (AAA), in-cabin camera-based DMS is markedly more effective than indirect monitoring methods such as torque sensors. For example, the Cadillac with an infrared-camera DMS allowed an average distraction time of 7.7 seconds, compared with 37.7 seconds for the Tesla (whose in-cabin camera was not activated).

The results highlight the growing safety risk as "hands-off" assisted driving becomes increasingly common.

According to monitoring data from the Gaogong Intelligent Vehicle Research Institute, among new cars (excluding imports and exports) sold in the Chinese market in 2021 with L2-level ADAS functions as standard, fewer than 0.5% were fitted with camera-based in-cabin DMS.

One

According to assessments by relevant agencies, driver assistance systems are falling into a "safety" crisis: increasingly aggressive marketing by automakers has led some consumers to overestimate the systems' capabilities. From a safety perspective, L2-level assisted driving requires continuous driver monitoring.

In the AAA's vehicle tests, driver monitoring systems were divided into direct and indirect monitoring. Direct systems integrate a driver-facing camera to detect whether the driver is distracted or fatigued (both of which lengthen takeover time). Indirect systems rely only on steering wheel inputs (capacitive or conventional pressure sensing) to detect driver distraction or disengagement.

The test covered two scenarios: A (head facing forward, gaze down, hands off the steering wheel) and B (head turned to the side, hands off the steering wheel), and evaluated each driver monitoring system under both daytime and nighttime lighting conditions.

The four models tested were the 2021 Cadillac Escalade, 2021 Subaru Forester, 2021 Hyundai Santa Fe, and Tesla Model 3; the first two are equipped with camera-based monitoring systems. The test speed was approximately 63 mph, with timing based on the system's audible, visual, or haptic alerts.


Under all lighting conditions, direct monitoring systems were significantly more effective than indirect systems. In Scenario A, direct monitoring systems issued an alert on average 50 seconds earlier than indirect systems under both lighting conditions. The Scenario B results were essentially at the same level. This suggests that lighting conditions are not a major factor in the performance of vision-based driver monitoring systems.


The reason is that mainstream camera-based DMS integrates infrared illumination, minimizing the impact of low ambient light on system performance. At night, reduced road visibility and a higher likelihood of fatigued driving do, however, increase takeover reaction time.

Under brighter daytime conditions, however, the vision-based DMS showed some sensitivity to direct sunlight on the test driver's face at certain angles; in the test data, alert times increased slightly.

Regardless of disengagement mode or lighting conditions, direct driver monitoring systems significantly outperformed indirect ones. From a safety standpoint, though, even with direct monitoring the driver takeover time was still around 5 to 8 seconds, which remains too long; the upside is that because the alert is output continuously, it can prevent the driver from disengaging for extended periods.

To test whether the monitoring systems could be deliberately circumvented while the driver assistance system was active, the researchers tried periodic head/eye movements or periodically releasing and retaking the steering wheel; the test report did not disclose which circumvention strategy each tester used.

In this scenario, the average time before the system issued an alert increased significantly, and indirect monitoring systems took markedly longer to alert than direct ones.


The data show that with direct monitoring, the driver could keep their eyes off the road for 135 seconds before an alert was issued. In theory, that would let the vehicle travel more than 2 miles on the highway. Indirect monitoring systems took even longer, equivalent to 6 miles. This means the indirect systems in the test were even less effective at curbing misuse of assisted driving systems.
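As a quick sanity check on these distance figures, the mileage follows directly from the roughly 63 mph test speed (a back-of-the-envelope calculation, not part of the AAA report):

```python
# Distance covered while the driver is disengaged, at the ~63 mph test speed.
TEST_SPEED_MPH = 63

def miles_traveled(seconds: float, speed_mph: float = TEST_SPEED_MPH) -> float:
    """Convert an eyes-off-road duration into highway miles traveled."""
    return speed_mph * seconds / 3600

# Direct monitoring: alert after ~135 s of disengagement.
print(round(miles_traveled(135), 2))      # ~2.36 miles, i.e. "more than 2 miles"

# Indirect monitoring: 6 miles corresponds to roughly 343 s before an alert.
print(round(6 / TEST_SPEED_MPH * 3600))   # ~343 seconds
```

At highway speed, the gap between the two systems thus amounts to several minutes of unmonitored driving.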

This window (the 135 seconds above is only the time until the alert is issued) implies a sharp rise in systemic risk. By comparison, the product manual for Mercedes-Benz's latest-generation Drive Pilot (L3) requires the driver to be ready to take control at all times and to resume manual driving within 10 seconds.

Two

China's previously issued policies on access management for intelligent connected vehicle manufacturers and products explicitly call for strengthening the safety management of autonomous driving products. For example, a vehicle should be able to automatically identify failures of the autonomous driving system, detect whether the design operating conditions are still met, and take risk-mitigation measures to reach a minimal-risk state.

Among these requirements, human-machine interaction is key: if the driver is required to perform dynamic driving tasks under specific conditions, the system should be able to assess the driver's ability to perform those tasks, ensuring the driver remains capable of taking over.

Clearly, judging from current market trends and the evolution of regulatory policy, whether through new-car regulations or ratings, in-cabin human-machine interaction and monitoring functions have only just been formally defined as requirements.

In Continental's view, in-cabin sensing technology plays another important role on the road to autonomous driving. The most critical link is human-machine interaction, and safe takeover is a task the automated driving system must master.

Last year, the company for the first time integrated the camera directly into the display (a concept similar to a smartphone's under-display camera), rather than placing it in the steering wheel, instrument cluster, rearview mirror, or A-pillar as in current vehicles.

Tesla has pre-installed an in-cabin camera facing the driver and rear passengers (at the interior mirror) since the Model 3, but the company's owner's manual states that these cameras are not used to monitor drivers.

Against a backdrop of repeated accidents, regulators have begun applying pressure on Tesla to add robust driver monitoring mechanisms, such as the vision-based monitoring systems some automakers have already enabled.

In addition, as Euro NCAP's new protocols take effect, vision-based driver monitoring will become a prerequisite for a five-star rating. Under the plan, from June 2022 all new cars entering the EU market must be fitted with similar safety systems, including driver drowsiness and attention warnings.

Current camera-based DMS installations in new cars fall into two categories: one extends cockpit interactive features built on Face ID-style functions, replacing traditional non-visual monitoring systems; the other comes as standard with advanced driving assistance systems such as L2+/L3, though the latter's share is still low.

The Gaogong Intelligent Vehicle Research Institute expects the next three years to be the first peak period for mass production of front-installed camera DMS in new vehicles in China, especially for human-machine interaction in automated driving assistance; front-installed camera DMS in passenger cars is expected to exceed 20 million units over the next five years.

DMS serves two roles: verifying that the correct driver state is detected, and gating the activation and fallback of ADAS functions. Vehicle R&D departments are accordingly shifting DMS planning and development from the traditional cockpit HMI domain to the intelligent driving department.

Horizon was the first company in China to propose a combined hardware-software approach, embedding AI algorithms in the chip. It has launched the Journey 2, Journey 3, and Journey 5 chips and has built a complete "autonomous driving + intelligent cockpit" product layout linking sensing inside and outside the car.

At GAC, this multimodal human-machine interaction technology, combined with the ADiGO 4.0 intelligent driving ecosystem, has been branded the "Super Sense Interactive Intelligent Cockpit"; the partners behind it include Huawei, Tencent, iFLYTEK, Horizon, Suzhou Zhihua, SenseTime, Desay SV, and Magneti Marelli.

To date, models from many brands, including Changan, Great Wall, GAC, JAC, and Li Auto, have been fitted with DMS human-machine interaction functions built by Horizon on Journey 2, ranking first in total market share in China. Meanwhile, multimodal interaction products from several automakers and Tier 1 suppliers based on Horizon's Journey-series chips are also entering mass production and delivery.

In the view of the Gaogong Intelligent Vehicle Research Institute, as Chinese market regulation gradually tightens, front-installed standard functions, including camera-based DMS and capacitive steering wheels (for hands-off detection), are the trend of the times.

As the integration of ADAS and human-machine interaction enters the "deep water zone", the bar for driver monitoring systems will rise further. Low-cost schemes based only on face recognition (simple common patterns such as head movement, yawning, and eyelid closure) will be phased out, while more precise technologies such as eye tracking (gaze direction), facial expression, and emotion monitoring become the mainstream trend.
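To illustrate what gaze-based monitoring adds over simple pattern checks, here is a minimal sketch of an eyes-off-road timer of the kind a gaze-tracking DMS might run once per camera frame. The threshold values and the per-frame gaze classification are illustrative assumptions, not figures from the article or any regulation:

```python
from dataclasses import dataclass

@dataclass
class GazeTimer:
    """Accumulates continuous eyes-off-road time and escalates alerts.

    Thresholds below are hypothetical examples, not regulatory values.
    """
    warn_after_s: float = 3.0       # first (visual/audible) warning
    escalate_after_s: float = 6.0   # stronger alert, prepare ADAS fallback
    _off_road_s: float = 0.0

    def update(self, eyes_on_road: bool, dt: float) -> str:
        # Reset the timer whenever the gaze returns to the road.
        if eyes_on_road:
            self._off_road_s = 0.0
            return "ok"
        self._off_road_s += dt
        if self._off_road_s >= self.escalate_after_s:
            return "escalate"
        if self._off_road_s >= self.warn_after_s:
            return "warn"
        return "ok"

# Example: driver looks away for ~7 seconds, sampled at 10 Hz.
timer = GazeTimer()
states = [timer.update(eyes_on_road=False, dt=0.1) for _ in range(70)]
print(states[0], states[40], states[-1])  # ok warn escalate
```

An indirect, steering-input-only system has no equivalent of this per-frame signal, which is consistent with the much longer alert times the AAA test measured.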
