
How long must we keep "eating" the AR HUD's "meal replacement"?


Whether it is adding animation to a WHUD or showing the left and right eyes different images to trick the brain into perceiving depth, these are stopgap measures for a technology that has not yet matured. They are more like "meal replacements" served before the real AR HUD is ready...

When it comes to in-car HUDs, opinions are always divided: fans say a HUD delivers fighter-jet-like driving immersion, while detractors cannot stand it; some love the sense of technology it creates, others feel that information floating in front of the windshield blocks the driving sight line; some complain it is not bright enough during the day, others that its colors are too dazzling at night...

Faced with all these complaints, we have to speak up in the HUD's defense.

Although this fighter-jet technology was first introduced into cars in 1988, from the early CHUD that required a separate transparent plexiglass combiner, to the WHUD that uses the windshield itself as the screen, to the AR HUD with far better imaging, the HUD only truly began to evolve once smart cars hit the road.


Therefore, although the HUD is controversial, that has not slowed its spread in smart cars: a feature once reserved as an option on high-end cars is gradually moving into lower-priced models, and on the recently released Li Auto L9 the HUD even replaces the instrument cluster outright as the primary information display. And as HUDs race toward the mass market, the hardware has been upgraded in step: images are getting bigger, imaging quality is improving, and more driver-assistance information is being displayed... For the leading players in this field, the AR HUD is clearly the shared goal.

CES has always been a barometer of cutting-edge technology and a window into the latest AR HUD progress, but with auto shows repeatedly postponed, it has become harder to get hands-on with HUD makers' latest work; mostly we can only glimpse it through the mass-production cars already on the road.


To that end, we reached out to GeekCar's old friend FUTURUS (Future Black Technology) in Beijing. One of the pioneers in China's HUD field, FUTURUS began a formal partnership with BMW in 2018 and is also the core HUD supplier for one of the country's new car-making forces.

Over the course of our conversation with FUTURUS, the "AR HUD" we had experienced countless times in the past year gradually went from unfamiliar to familiar, and we began to understand it anew.

Is your AR HUD really "AR"?

Many people have called 2021 the "first year of AR HUD".

What is an AR HUD? Its predecessor, the WHUD, uses the front windshield as a screen and projects a virtual image to a point ahead of it, so that while driving the driver can read speed, fuel consumption, and other instrument data, and even navigation and driver-assistance information, without constantly glancing down or turning to check the instrument cluster or center screen. An AR HUD takes this a step further: it "augments" that information onto the road, fusing the virtual image with the real scene ahead.

(Evolution of the effect from C HUD to W HUD to AR HUD)

For a virtual HUD image to blend with the real scene, the image needs depth: when you mark a vehicle 5 meters ahead, the image should appear 5 meters away; when you mark a lane line 10 meters ahead, the image should appear 10 meters away. Presenting a three-dimensional virtual image that adapts to the eye's continuous refocusing and fuses the virtual with the real is a necessary condition for an AR HUD.

In 2021, many models launched "AR HUD" features: the Mercedes-Benz S-Class, the Volkswagen ID. series, the Hongqi E-HS9, the Great Wall WEY Mocha, and more; the market seemed to be booming. These products, which debuted under the "AR HUD" name, did improve the visual experience, with larger and clearer images, longer imaging distances, and in some cases a sense of motion, but the actual experience falls short.


(Rendering of the AR HUD function on the Volkswagen ID. series)

It's not that users are too picky; it's that the human eye is too sharp.

These "AR HUD" products use close-range single-layer or double-layer 2D displays. To a user, whether the display is 2D or 3D shouldn't matter, as long as it works well. The problem is, they don't work well.

This is like the naked-eye 3D posters we played with as students: today's "AR HUD" mainly shows different images to the left and right eyes to trick the brain into assembling a stereoscopic effect.
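The trick, and its limit, can be put in rough numbers: the brain judges depth partly from the angular disparity between the two eyes' views of a point, and that disparity shrinks with distance. A minimal sketch, where the function name and the 63 mm interpupillary distance are our own illustrative assumptions:

```python
import math

def disparity_deg(depth_m, ipd_m=0.063):
    """Angular disparity (degrees) between the left- and right-eye views
    of a point depth_m metres away, given an interpupillary distance
    ipd_m (63 mm is a commonly cited average; both values illustrative)."""
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * depth_m)))

# A 2D HUD image fixed at ~2.5 m versus the objects it tries to mark:
for d in (2.5, 5.0, 10.0, 50.0):
    print(f"{d:>5.1f} m -> {disparity_deg(d):.3f} deg")
```

A display that claims a given depth has to reproduce the disparity of that depth; when the fixed image plane's own disparity (at roughly 2.5 m) disagrees with that of the object it marks (at 10 m or more), the eyes notice the conflict.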


(A cross-eyed naked-eye 3D image: can you see it?)

However, not everyone can easily see a naked-eye 3D image, and differences between individual eyes mean not everyone will be "deceived" into seeing the stereoscopic effect. Add vehicle vibration, and you may end up with an "AR HUD" that is both a poor experience and dizzying.

Even if the vertigo problem is solved, how to make a two-dimensional image blend seamlessly into a three-dimensional world remains an intractable problem.

For example, when the AR HUD works with the ACC function, it needs to track the position of the car ahead: as the gap to that car opens up, the marker overlaying it should recede accordingly. To create this dynamic "simulated tracking" effect, some "AR HUDs" resort to animation.
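How much such a marker should shrink as the gap opens follows simple pinhole geometry: apparent size scales as one over distance. A rough sketch, where the function and the 1200 px focal length are illustrative assumptions rather than any vendor's numbers:

```python
def marker_px(real_width_m, dist_m, focal_px=1200.0):
    """Pinhole-camera approximation: on-screen width (px) of a marker
    overlaying an object real_width_m wide at dist_m metres ahead.
    focal_px is an assumed focal length in pixels."""
    return focal_px * real_width_m / dist_m

# A marker on a 1.8 m-wide lead car, as the ACC gap opens up:
for d in (10, 20, 40):
    print(f"{d} m -> {marker_px(1.8, d):.0f} px")
```

Halving the distance doubles the marker; a canned animation can only approximate this continuous rescaling, which is why it never quite locks onto the real car.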

On the AR HUD of the Mercedes-Benz S-Class, 2D animation creates the dynamic effect of guidance arrows, making the picture look convincingly three-dimensional. But while animation can simulate motion, it cannot achieve fusion with the environment, so the S-Class simply gave up on merging the virtual image with the scene, leaving the AR HUD's virtual information floating "alone" in front of the windshield.


(POV of the Mercedes-Benz S-Class AR HUD, from GeekCar's Little Fat Man)

Another drawback of the current "AR HUD" approach: enlarging the field of view and image size on top of an existing flat display makes the package enormous. Take the Mercedes-Benz S-Class again, currently the best of the bunch: its "AR HUD" box measures 27 liters. That size, and the cost behind it, have deterred many affordable models.

Therefore, whether it is adding animation to a WHUD or showing the left and right eyes different images to trick the brain into perceiving depth, these are stopgap measures while AR HUD technology is immature; they are more like "meal replacements" before the real AR HUD arrives:

On the one hand, these techniques familiarize consumers with the AR HUD concept and build acceptance; on the other, companies are stepping up R&D on a "true AR HUD" in which the 3D virtual image fuses with reality, so that the image no longer "floats" in front of the vehicle but "grows" onto the object being tracked and recognized, naturally avoiding visual fatigue, vertigo, and interference with driving.

What "slowed down" the AR HUD?

If the thirty-odd years since the HUD first got into cars count as "history", the AR HUD concept is not new either. As early as the 2016 BMW VISION NEXT 100 concept car, we could see its prototype.


(Rendering of the AR HUD on the BMW VISION NEXT 100 concept car)

A few years ago, the industry optimistically pegged the AR HUD's arrival at 2022. It is now 2022, and although many so-called "AR HUD" products have shipped, a production car that truly delivers the AR effect is clearly still a long way off; otherwise, no one would be rushing "meal replacements" to market.

On this point, Wei Hongxuan, co-founder of FUTURUS, was blunt: HUD technology has matured, but the AR HUD is far from mature.

So what makes the AR HUD so hard to advance?

01

Technical threshold:

Three-dimensional imaging, fused with reality

In the words of Xu Junfeng, founder and CEO of FUTURUS, the hardest parts of the AR HUD are presenting a virtual world through 3D light field display technology, and fusing virtual and real in real time.

Unlike the existing 2D versions of the "AR HUD", the key to a true AR HUD is a stereoscopic display in front of the driver's line of sight, presented three-dimensionally like a holographic projection. This has become the focus of the leading players in the HUD field.


For example, Continental of Germany, a traditional HUD supplier, has invested in the startup DigiLens to pursue holographic waveguide technology, creating three-dimensional imaging through a principle similar to a holographic sight; Panasonic and the HUD startup WayRay have both chosen laser holographic projection to achieve three-dimensional virtual images with continuous visual zoom.


Apple, which has been building momentum in AR, has also shown in a published patent that it plans to output three-dimensional imaging through light field HUD technology. Similarly, FUTURUS has bet on light field AR HUD technology, and filed related patents as early as 2016, a step ahead of Apple. The light field AR HUD that FUTURUS plans to unveil at the Beijing Auto Show may be the closest thing to a "true AR" HUD we can get our hands on today.


It is fair to say that in the race to build a real AR HUD, everyone is working against the clock, and the competition is fierce.

At the same time, once the three-dimensional virtual world is created, the harder problems of virtual-real fusion, such as latency and jitter, must be solved. This fusion spans both space and time.

To this end, FUTURUS uses self-developed graphic correction, data compensation, and 3D rendering technologies to present the HUD's virtual image in 3D together with the surrounding environment, aligned and displayed at the corresponding spatial positions. When the virtual image and the real scene fuse perfectly, the HUD image seems to "grow" out of the environment, with no sense of dissonance or unreality.
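The core of the spatial-alignment problem can be sketched with elementary geometry: the overlay must sit where the driver's sight line to the object crosses the virtual image plane. The sketch below is our own illustrative model, not FUTURUS's actual pipeline:

```python
import numpy as np

def project_to_hud(point_vehicle, eye_pos, plane_dist):
    """Intersect the eye-to-object sight line with a virtual image plane
    plane_dist metres ahead of the eye (x forward, y left, z up, metres).
    Drawing the marker at this spot keeps it visually "on" the object."""
    eye = np.asarray(eye_pos, dtype=float)
    ray = np.asarray(point_vehicle, dtype=float) - eye
    t = plane_dist / ray[0]
    hit = eye + t * ray
    return float(hit[1]), float(hit[2])

# A lane marking 10 m ahead, seen from an eye point 1.2 m above ground:
y, z = project_to_hud((10.0, 1.5, 0.5), (0.0, 0.0, 1.2), 2.5)
print(f"on a fixed 2.5 m plane the marker sits at ({y:.3f}, {z:.3f}) m")
```

Two things fall out of this picture: any head movement shifts `eye_pos` and moves the intersection, which is why correction and compensation are needed; and a light field display that can place the image plane at the object's own distance makes the overlay coincide with the object for any eye position.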

In terms of time, the instrument information an AR HUD displays must be real-time. Especially when the AR HUD connects to navigation and ADAS, information such as route guidance, lane keeping warnings, and forward obstacle warnings must stay synchronized with the vehicle's sensors, processors, vehicle state, and surroundings, with latency held to within milliseconds, below what the human eye can perceive.
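The arithmetic behind that millisecond budget is unforgiving. A rough sketch, using our own illustrative helper rather than any FUTURUS figure:

```python
def drift_m(speed_kmh, latency_ms):
    """Distance the world moves past the car while overlay data is
    latency_ms stale, at speed_kmh (illustrative arithmetic only)."""
    return (speed_kmh / 3.6) * (latency_ms / 1000.0)

# At highway speed, even modest latency leaves the marker trailing
# the object it is meant to sit on:
for ms in (16, 50, 100):
    print(f"{ms:>3} ms at 120 km/h -> marker lags by {drift_m(120, ms):.2f} m")
```

At 120 km/h the car covers about 33 m every second, so even a single 60 Hz frame of lag (roughly 16 ms) displaces a tracked object by over half a meter, and the misalignment is plainly visible.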


Of course, beyond the core work of virtual image processing and virtual-real fusion, the AR HUD also faces many of the same problems HUDs faced before it: the choice among PGU (picture generation unit) technology routes, which determines imaging quality and cost but depends heavily on upstream material suppliers; and the strict precision requirements on the core optical curved mirrors, which likewise test the AR HUD's upstream manufacturers.

02

A milestone for HUD suppliers: mass production in real cars

What counts as a milestone for HUD vendors and startups?

"Getting into a mass-production car," Wei Hongxuan replied.

In his view, getting a HUD into a car is a process of settling for workable, suboptimal solutions. As a product built around optical design, its precision and complexity exceed even a camera's, and the slightest structural adjustment changes the final effect. This makes the productization of a HUD no less hard than the upfront R&D investment.

Nippon Seiki, which holds about half of the HUD market, has revealed that its Mercedes-Benz S-Class HUD took nearly three years, from the start of development in 2018 to the S-Class's mass production at the end of 2020, and that was built on Nippon Seiki's many years of WHUD mass-production experience.


Coincidentally, the HUD function of one new Chinese car-making force took a full four years from planning to final mass production. During that time, every slight adjustment to the model, every change in windshield angle, required a corresponding redesign of the HUD optical path, especially the core optical curved mirror; every change in the body's sensor layout meant the HUD's data access had to be adapted too. This dooms both the HUD and the AR HUD to be hard to scale and modularize.

This tailor-made engineering limits the pace at which the AR HUD can finally get into cars.


As for when the real AR HUD will land, companies at home and abroad have independently given similar answers. In FUTURUS' estimation, it will take about three years from this year's light field AR HUD unveiling to its final application in a mass-production car, that is, until 2025. Coincidentally, Panasonic, which demonstrated its laser holographic projection AR HUD at CES 2022, also set its product's landing at 2025.

So although many regard 2021 as the first year of the AR HUD, we will still have to wait several more years to use a real one.

When humans no longer need to drive,

where does the AR HUD end up?

Whether in a fighter jet's cockpit or on today's smart cars, the HUD's biggest role is to present key information directly in the line of sight, reducing the accident risk caused by briefly glancing down at the instruments.

The development of the AR HUD and of autonomous driving seem to contradict each other: if one day fully autonomous driving arrives and no one needs to drive, will we still need AR HUDs? What would they be for?

Musk's oft-cited "first principles" applies to HUDs as well. From CHUD to WHUD to the future AR HUD, the forms and effects differ, but the mission is unchanged: to serve as a display carrier in the cockpit, providing convenience for humans and meeting users' interaction needs.

Therefore, as autonomous driving gradually liberates the driver and human needs change, the role of the HUD will naturally change too.

01

Information Simulator Phase:

Driving safety is a top priority

At this stage, humans hold absolute control of driving, and intelligent driving systems exist to give early warnings and intervene in special cases. The HUD's mission here is to display the information needed for driving intuitively, without interfering with the driver's vision, making driving safer.

Therefore, the current HUD is more like an information simulator: it displays instrument data such as speed and speed limits, along with navigation and some ADAS information to assist driving.

For example, the AR HUD works with the LDW lane departure warning system and ACC adaptive cruise: marking the lane ahead, warning when the car drifts out of it, marking the position of the car in front, and adjusting as the following distance changes. On the Li Auto L9, the HUD is no longer an "optional" backup for instrument information; it outright replaces the physical dashboard.


(The Li Auto L9's HUD, which replaces the physical dashboard)

02

Intelligent System Visualization Phase:

Enhance trust between people and systems

As vehicles grow more intelligent, more and more scenarios can be handled by the system on its own, with a human taking over only when certain conditions are triggered: the so-called L3 stage of automated driving. Strictly speaking, today's pilot assisted-driving functions cannot yet be called L3 autonomy, but they face the same questions: when is the car managed by the system, and when by the person? And when the system runs into a situation it cannot handle, how does the driver learn in time that a takeover is needed?


(Visualization and tiered early-warning functions proposed by the Zhiji L7)

This requires a visual medium between person and vehicle that shows the system's status in real time. Did the system detect the red light ahead? Did it see the bicycle that suddenly cut in? Can it not complete that sharp turn, so a person needs to take over? With the AR HUD's visualization, the system's status is clear at a glance, and trust between human and system is built in this process of knowing both oneself and the machine.

03

The Era of Driverless Driving:

A "window" for communication inside and outside the car

When fully autonomous driving arrives, instrument information, assistance warnings, and system status will no longer matter much; we will fully trust the vehicle's decisions. The cockpit's entertainment functions and screens will multiply, and the car will become a true mobile "third space".

Yet even with more screens in the cockpit, the window remains the natural channel through which people in the car communicate with the outside world. Like the living-room TV, no matter how large, it cannot replace our yearning for the scenery outside the window. At that point it is only natural to project information onto the car window's natural screen, displaying the cafes, restaurants, and shops nearby.


When the car window becomes a channel between inside and outside, our interaction with the world will no longer be confined to a flat plane; the depth the AR HUD creates will deliver an immersive experience, so that the people in the car are no longer trapped between the cabin and its screens.

Finally

If the HUD market in the WHUD era largely belonged to foreign companies, then with the push toward intelligence, the AR HUD, with its richer attributes, pulls everyone back to the same starting line, letting us watch domestic startups and traditional suppliers advance side by side.

Today's underperforming AR HUDs look like an inevitable, awkward stage in the HUD industry's evolution. We shouldn't write off the AR HUD itself, and we look forward to the real thing arriving soon: like Tom Cruise in the movie "Mission: Impossible 4", driving the BMW Vision EfficientDynamics concept, loading a navigation map onto the windshield at will and switching interfaces as he goes.


Are you looking forward to such a smart-car future?
