How are realistic, real-time "digital humans" produced?

Text: Reliable A Xing

Digital technology is the engine of the information age. AR, VR, and 3D technologies have once again captured public attention, and their visual, high-precision presentation is set to reshape the information world.

Take the new wave of Chinese animated films of recent years: they can no longer be dismissed as cartoons, because the lighting and photorealistic rendering of their characters and scenes can be even more striking than live-action footage. Behind the scenes, digital production teams build demos from scripts and storylines to pitch prospective investors, which both reduces financial risk and lets audiences see more cutting-edge, original work.

Beyond film, there are also virtual-avatar IP, commercials, architectural and bridge design, factory planning drawings, interior-decoration panoramas, panoramic car configurators for pre-orders, and more, all of which rely on 3D and AR visual design to present a realistic preview of the intended result or to give users an interactive online interface.

What many people do not realize is that these digital technologies and tools first matured in game development and only gradually "spilled over" into other industries and application scenarios. Much of the underlying development work runs on Unreal Engine (hereinafter "UE"). In online games and visual design, UE plays a role comparable to Android or iOS in mobile, yet it has largely stayed out of public view.

We went on to ask whether UE could use similar techniques to simulate real people in the digital world, rather than cartoon-style animation, much like the VR game environment depicted in The Three-Body Problem. The answer is yes: the technical conditions and tools are fully in place, because the fidelity demanded by characters in countless game and film scenes requires development tools that can render faces, eyes, skin, hair, and shifting light and shadow with sufficient subtlety and realism.

A so-called "realistic, real-time digital human" is not an invented virtual IP or a two-dimensional character; it is the digital counterpart of a real person. As A Xing understands it, such digital humans are three-dimensional, like a digital "Madame Tussauds", and the approach can be applied to ordinary people, so breakthroughs and iterations in this technology could matter greatly for the wider adoption of AR, VR, and related media.

(Wax figure of a real-life celebrity on display at Madame Tussauds)

How exactly are realistic, real-time digital humans produced, in which scenarios will they be applied, and where are the current bottlenecks? The author had the opportunity to consult Song Zhen, assistant dean of the Industrial Research Institute at Renmin University of China and director of its Cultural and Technological Integration Center, and Weng Dongdong, a researcher at the Institute of Optoelectronic Information Technology and Color Engineering, School of Radio, Film and Television, Beijing Institute of Technology. Both are academic authorities in this field and stand at the forefront of research on the presentation and exploration of "digital humans".

First, creating "realistic, real-time digital humans" with a full suite of game-development tools and systems

First of all, Unreal Engine is a game engine and development platform created by the game company Epic Games. It supplies game developers with much of the core technology, data-generation tooling, and basic infrastructure that game development requires. In other words, UE means developers no longer have to write code from scratch for every frame of the picture or every scene; they can focus on story and difficulty design, return to the game itself, and develop more easily.

This in turn requires UE itself to keep iterating its technology and toolkits to meet the online digital world's demands for gamification, interactivity, realism, and dynamism, applying ever more effects and photorealistic image processing to make the picture more lifelike and more finely detailed.

By 2021, Unreal Engine had advanced to UE5, built on the core technologies of Nanite (virtualized micropolygon geometry) and Lumen (dynamic global illumination) and able to render hundreds of millions, billions, or effectively unlimited numbers of triangles. This not only gives game developers and designers a broad creative stage, it can also be applied to people, objects, stories, and all kinds of content.
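
To give a feel for the general principle behind rendering geometry at that scale, here is a minimal, purely conceptual Python sketch of screen-space-error-driven level-of-detail selection: a renderer only draws as many triangles per cluster as the viewer can actually perceive at a given distance. This is not Nanite's implementation; the Cluster structure, error values, and thresholds are all hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class Cluster:
    """A small group of triangles with precomputed LOD variants (hypothetical)."""
    triangle_counts: list[int]    # triangles per LOD level, coarse -> fine
    geometric_error: list[float]  # world-space error of each LOD level, in meters
    distance: float               # distance from the camera, in meters

def pick_lod(cluster: Cluster, screen_height_px: int, fov_y_rad: float,
             max_error_px: float = 1.0) -> int:
    """Pick the coarsest LOD whose projected error stays under roughly one pixel."""
    # Pixels covered by one meter of world space at the cluster's distance.
    px_per_meter = screen_height_px / (2.0 * cluster.distance * math.tan(fov_y_rad / 2.0))
    for lod in range(len(cluster.geometric_error)):           # coarse -> fine
        if cluster.geometric_error[lod] * px_per_meter <= max_error_px:
            return lod
    return len(cluster.geometric_error) - 1                   # fall back to finest LOD

# A distant cluster resolves to a coarse LOD, a nearby one to the finest LOD.
far_cluster = Cluster([128, 512, 2048], [0.20, 0.05, 0.01], distance=200.0)
near_cluster = Cluster([128, 512, 2048], [0.20, 0.05, 0.01], distance=2.0)
print(pick_lod(far_cluster, 1080, math.radians(60)))   # 0 -> 128 triangles
print(pick_lod(near_cluster, 1080, math.radians(60)))  # 2 -> 2048 triangles
```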

In the game world, the hardest thing to depict is still a person. For a real, lifelike person, the face and expressions change with age, mood, and state, and across different scenes the eyes, lighting, gait, and body shape are hard to capture through imagination alone. Just as reality sometimes outstrips fiction, the amount of work needed to record a digital human in real time is far greater than for a fictional character. In December, for example, NetEase Cloud Music listed in Hong Kong with two digital "Ding Lei" avatars ringing the bell online. Some netizens called it a beautified Ding Lei, but it still falls short of a true "digital human": in the facial expressions especially, viewers can tell at a glance that it is "not realistic enough".

What people watch most closely when judging whether a digital human is lifelike and vivid is the expression. Song Zhen said: "People are extremely sensitive to differences in expression. The face has more than 70 muscles, which can produce many complex and subtle changes, so digitizing faces and expressions is both the focus and the difficulty of digital humans."
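
One standard way facial expressions are digitized in practice is with blendshapes: a neutral scan plus a library of expression offsets that are mixed by per-expression weights. The sketch below is a minimal illustration of that general technique, not the specific pipeline described in this article; the mesh, expression names, and weights are invented for the example.

```python
import numpy as np

def blend_expression(neutral: np.ndarray,
                     deltas: dict[str, np.ndarray],
                     weights: dict[str, float]) -> np.ndarray:
    """Combine a neutral face mesh with weighted expression deltas.

    neutral: (N, 3) vertex positions of the relaxed face.
    deltas:  per-expression offsets from neutral, each (N, 3).
    weights: activation of each expression, typically in [0, 1].
    """
    result = neutral.copy()
    for name, w in weights.items():
        result += w * deltas[name]
    return result

# Toy example: 4 vertices and two hypothetical expression targets.
neutral = np.zeros((4, 3))
deltas = {
    "smile_left": np.array([[0.0, 0.1, 0.0]] * 4),
    "brow_raise": np.array([[0.0, 0.0, 0.2]] * 4),
}
posed = blend_expression(neutral, deltas, {"smile_left": 0.8, "brow_raise": 0.3})
print(posed[0])  # [0.   0.08 0.06]
```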

To create a realistic, real-time digital human, the first step is to "photograph" the model or prototype subject and collect data. The workflow is mainly a photogrammetry-based digital-twin scheme: spherical gradient illumination captures the model's micro-expression changes under different lighting, and the material (RAW photographs) then goes through color calibration, keying, facial-data processing, and rendering in the system. Reportedly, a single realistic real-time digital human can involve around 200 GB of raw material.
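
As one concrete illustration of the color-calibration step, the sketch below fits a simple 3x3 correction matrix from color-chart patches and applies it to a linear image. It is a generic least-squares approach, not the team's actual pipeline; the chart values and the simulated color cast are made up for the example.

```python
import numpy as np

def fit_color_matrix(measured: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Least-squares 3x3 matrix mapping camera RGB to reference RGB.

    measured, reference: (P, 3) linear RGB values of the same P chart patches.
    """
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M  # apply as camera_rgb @ M

def apply_color_matrix(image: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply the correction to an (H, W, 3) linear image and clip to [0, 1]."""
    h, w, _ = image.shape
    corrected = image.reshape(-1, 3) @ M
    return np.clip(corrected, 0.0, 1.0).reshape(h, w, 3)

# Hypothetical chart readings: the camera sees everything with a slight red cast.
reference = np.array([[0.2, 0.2, 0.2], [0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]])
measured = reference * np.array([1.2, 1.0, 0.9])   # simulated camera cast
M = fit_color_matrix(measured, reference)
frame = np.random.default_rng(0).random((4, 4, 3)) * np.array([1.2, 1.0, 0.9])
print(apply_color_matrix(frame, M).max() <= 1.0)    # True
```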

According to Song Zhen, the creation process for a realistic, real-time digital human breaks down into nine main steps: light-and-shadow capture, data cleaning, retopology, texture processing, rigging, hair grooming, import into UE, lighting and atmosphere, and dynamic testing, all of which require deep involvement from UE when brought together.
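
To make the sequential nature of those nine steps concrete, here is a toy Python sketch that models them as an ordered pipeline in which each stage adds its outputs to a shared set of assets. The stage functions and file names are placeholders standing in for real DCC and Unreal Engine work, not actual tooling.

```python
from typing import Callable

# The nine stages from the article, in order; every output name is illustrative.
STAGES: list[tuple[str, Callable[[dict], dict]]] = [
    ("light_and_shadow_capture", lambda a: {**a, "raw_frames": "scan.raw"}),
    ("data_cleaning",            lambda a: {**a, "point_cloud": "clean.ply"}),
    ("retopology",               lambda a: {**a, "mesh": "head.obj"}),
    ("texture_processing",       lambda a: {**a, "textures": "albedo.exr"}),
    ("rigging",                  lambda a: {**a, "rig": "face.rig"}),
    ("hair_grooming",            lambda a: {**a, "groom": "hair.abc"}),
    ("import_into_ue",           lambda a: {**a, "ue_asset": "/Game/DigitalHuman"}),
    ("lighting_and_atmosphere",  lambda a: {**a, "level": "Stage.umap"}),
    ("dynamic_testing",          lambda a: {**a, "report": "passed"}),
]

def run_pipeline(assets: dict) -> dict:
    """Run the stages in order; each stage consumes and extends the asset dict."""
    for name, stage in STAGES:
        assets = stage(assets)
        print(f"{name}: done ({len(assets)} assets so far)")
    return assets

run_pipeline({"subject": "model_001"})
```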

(Song Zhen presenting the production process for realistic, real-time digital humans)

For professional studios producing high-precision real-time digital humans, keeping expressions changing in real time, along with the corresponding keying, lighting changes, and voice matching, all depends on the MetaHuman toolset. "Unreal Engine launched the MetaHuman tool in the first half of 2021 and quickly set the industry benchmark and standard in 3D digital humans," Song Zhen said.

MetaHuman not only provides powerful creation tools, such as a digital-human material system and animation system, that reduce project complexity; it can also work with optical scanning equipment. Its technical pipeline greatly improves production efficiency and performance, and existing digital-human production pipelines can be migrated to MetaHuman with good compatibility.

Creating digital humans involves not just one or two DCC (digital content creation) tools but a large, complex, and complete ecosystem. Weng Dongdong said: "Unreal (Engine) is not only a tool for real-time visualization; it is the platform for the entire realistic real-time digital human, you could even call it an operating system. Because every digital human (asset) has to be generated and brought to life in UE, Unreal Engine actually brings together the set of tools that is most valuable to that operating system."

Unreal Engine's convenience for 3D digital content production is like a well-stocked kitchen: a chef who wants to create a fine dish no longer has to buy every utensil or go to the market for vegetables, because the counters, pots, pans, and seasonings are already laid out, leaving the chef free to focus on cooking.

Second, high-precision, realistic real-time digital humans can already be produced, and the breakthrough lies in digitizing historical figures

As the various generation technologies mature, realistic digital humans, in essence "photos" and "videos" presented in 3D, will gradually become a reality and find their way onto VR (virtual reality) and AR (augmented reality) devices.

In his talk, Song Zhen stressed that combining Unreal Engine's MetaHuman with RTX (ray-tracing technology) yields even better rendering quality, and that deeper integration with AI is coming. High-precision virtual humans will appear soon, and customized virtual humans will first be applied in live streaming, concerts, and other performance scenarios.

Beyond putting digital humans into real-world scenes, the same technology can help restore famous historical figures and bring historical IP into people's real world. Many of the portraits of historical figures we see today were, after all, copied by later painters from people who merely looked similar. Realistic digital-human technology is therefore expected to lead to deep cooperation with museums and related institutions.

Weng Dongdong said: "The team hopes to build a set of standard digital assets for famous Chinese figures, starting with modern figures for whom photographs exist." At the Unreal Engine technology open day, he presented the team's sample case: the digital-human production of the Peking Opera master Mei Lanfang.

Weng Dongdong said: "Beijing Institute of Technology and the Central Academy of Drama are responsible for recreating Mr. Mei Lanfang with high-fidelity real-time digital technology, which will become China's first true Peking Opera digital-human project."

It is understood that Weng's team carried out extensive research and restoration based on Mr. Mei's footage, photographs, and stage scenes, studying everything from his craft, movements, and costumes to his repertoire. They first sculpted a high-precision physical model, then used digital sculpting in UE to repeatedly refine the scan data, built high-fidelity hair strand by strand with a guide-curve approach, constructed high-fidelity costumes based on real three-dimensional tailoring, and finally produced Mr. Mei Lanfang's 3D digital human.
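
The guide-curve approach to hair mentioned above generally works by authoring a small number of guide strands and interpolating many child strands between them. The sketch below illustrates that general idea with made-up geometry; it is not the team's actual groom setup.

```python
import numpy as np

def interpolate_child_strands(guides: np.ndarray, n_children: int,
                              jitter: float = 0.01, seed: int = 0) -> np.ndarray:
    """Generate child strands as random convex blends of the guide strands.

    guides: (G, P, 3) array of G guide curves with P points each.
    Returns an (n_children, P, 3) array of child curves with a small positional jitter.
    """
    rng = np.random.default_rng(seed)
    n_guides = guides.shape[0]
    weights = rng.random((n_children, n_guides))
    weights /= weights.sum(axis=1, keepdims=True)           # convex combination
    children = np.einsum("cg,gpd->cpd", weights, guides)    # blend guide points
    children += rng.normal(scale=jitter, size=children.shape)
    return children

# Three straight guide strands with ten points each, rooted at different x positions.
t = np.linspace(0.0, 1.0, 10)
guides = np.stack([
    np.stack([np.full_like(t, x), t, np.zeros_like(t)], axis=1)
    for x in (0.0, 0.05, 0.10)
])
children = interpolate_child_strands(guides, n_children=200)
print(children.shape)  # (200, 10, 3)
```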

This digital human will be presented to audiences in three dimensions through VR and AR, so that instead of watching Mr. Mei Lanfang's performance on a television screen or projection, a user could, in effect, meet the "real person" face to face.

Weng Dongdong said: "Anything that needs a real person or a real scene can be restored with digital-twin technology. From a commercial perspective, a digital human is itself a production tool: a high-fidelity digital human can serve as a digital asset or IP for making film and television dramas or shooting videos. Wherever there are people, and wherever people are used to produce content, digital humans can come in handy."

In China, the digital-human industry chain and its research and development are already taking root and being promoted in major cities such as Beijing, Shanghai, and Shenzhen. Weng Dongdong predicted: "Realistic, lifelike digital humans will be the direction of development over the next three years."

Of course, the intellectual-property and legal frameworks for real-time, realistic digital humans are also research topics for both scholars; technological development, after all, needs regulation that can keep pace.

Epilogue

Seen against human history, which has always involved re-presenting people, realistic real-time digital humans move the presentation of a person from the two-dimensional electronic screen to real-time three-dimensional form, a remarkable technical advance. Their production today relies on Unreal Engine (UE) development tools and its surrounding platform, and the process will gradually become lighter and simpler, lowering the professional barrier to entry so that accurately presenting real people remotely and in real time is no longer difficult. Whether for film and television creation or for commercial use, the space for imagination is enormous, and the generation, protection, circulation, and use of the corresponding digital-human assets will gradually draw attention. This technological shift will have a profound impact on the business world, and it is still in the early stage of active exploration.

▼About the Author

Reliable A Xing (Li Xing), public account: Reliable A Xing, author of the book "Media Strategy"
