
One million people immersed in a "cloud disco"! Why did this show catch fire?

Reporting by XinZhiyuan

Editors: Hao Kun, Taozi

【XinZhiyuan Guide】Stuck at home over the May Day holiday with nowhere to dance? Then you may have missed a spectacular party with virtual humans!

Up top, the big screen with the live performance; down below, a packed dance floor; scrolling along both sides, declarations of love from the audience; and right beside it all, a "virtual" CCTV host, Nigmat.

CCTV's May Fourth special program, buzzworthy in its own right, shot up the trending searches that very night.

So why did it catch fire?

One million people at a "cloud disco"

Rewind to December 31, 2021, when TMELAND, Tencent Music's virtual world, was officially unveiled.

That night, the band Mayday, performing offline at the main venue of the Kaohsiung World Games, also landed on TMELAND for a "virtual music carnival".

1.1 million music fans from around the world "landed on the island" to ring in the New Year together, a crowd equivalent to 12 sold-out concerts at the Bird's Nest.

At that point, the TMELAND project was barely five months old.

Back in the height of summer that July, the design team had suddenly been handed the task of building a "virtual music world".

Just as there are ten thousand Hamlets in the hearts of ten thousand people, every member of the team had their own ideal virtual world in mind.

While everyone was still at a loss, without even a name for the project, one designer started sketching the entire park by hand, guided by intuition.

Unexpectedly, the act of drawing brought an epiphany: the name TMELAND. Together with that blueprint, a vivid "island" seemed to take shape before everyone's eyes.

The technical team then joined the project's development in October.

As the design took shape, a problem followed: how could these effects actually be delivered to users?

Two paths lay before the technical team, the "client" and the "cloud":

1. Pack all content and assets into a huge app and run everything on the user's device;

2. Do all rendering and other computation in the cloud, so users only need a network connection.

The first path is relatively mature; plenty of games use this approach, including well-known titles like Peace Elite and Honor of Kings.

Setting aside those games' demanding requirements on player skill, it's fair to say that as long as you can get online, high latency is rarely a problem.

However, if the user's phone or other device has to carry the entire rendering load, it not only demands strong hardware; once the rendering engine and assorted assets are bundled in, the install package is measured in gigabytes, eating up storage and making for an unfriendly experience.

The first path is not easy, so what about the second?

As it turns out, this one is even harder.

Access may look cost-free for the user, but since all the content lives in the cloud, a smooth experience demands extremely high network bandwidth.

And the operator's costs max out: because every user needs independent rendering and output, the operator has to pour enormous server resources into keeping it running.

Clearly, for a brand-new product that demands extremely fast iteration, both paths were daunting.

After a round of brainstorming, TME's technical team chose a "third way".

Since pure "cloud" or "end" doesn't work, what if the two are combined?

For example, let the cloud handle rendering the large scenes, while the client handles lightweight interactions.
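
The program's actual architecture has not been published; purely as a sketch of the idea, the TypeScript snippet below (all names hypothetical) lets a cloud-rendered scene arrive as an ordinary video stream while lightweight interactions, such as other users' avatar markers, are drawn locally on an overlay canvas.

```typescript
// Minimal sketch of an "end-cloud" split (hypothetical, not TMELAND's real code):
// the cloud streams the heavy 3D scene as video; the client overlays lightweight interactions.

type AvatarState = { id: string; x: number; y: number; emote?: string };

function startHybridClient(video: HTMLVideoElement, overlay: HTMLCanvasElement, wsUrl: string) {
  const ctx = overlay.getContext("2d")!;
  let avatars: AvatarState[] = [];

  // Lightweight state updates (positions, emotes, chat) arrive over a plain WebSocket.
  const ws = new WebSocket(wsUrl);
  ws.onmessage = (ev) => { avatars = JSON.parse(ev.data) as AvatarState[]; };

  // The heavy scene is rendered in the cloud and consumed here as an ordinary video stream.
  void video.play();

  const draw = () => {
    ctx.clearRect(0, 0, overlay.width, overlay.height);
    for (const a of avatars) {
      ctx.beginPath();
      ctx.arc(a.x, a.y, 8, 0, Math.PI * 2);   // stand-in marker for a user's avatar
      ctx.fill();
      if (a.emote) ctx.fillText(a.emote, a.x + 10, a.y);
    }
    requestAnimationFrame(draw);               // local interactions refresh independently of the stream
  };
  requestAnimationFrame(draw);
}
```

The appeal of this split is that the device never has to run the heavy rendering engine, while per-user interactivity still happens instantly on the client.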

It was a bold idea; after all, splitting the whole pipeline into two parts like this had never been attempted in China before.

Moreover, TMELAND itself was a highly experimental project, so the solution had to support very fast updates to keep up with changing gameplay and content.

The pressure now fell on the technical team: for a "client plus cloud" combination to actually work in production, they had to strike a balance between iteration speed and performance.

After much deliberation, the joint TMELAND and XVERSE (元象) team decided to build this new virtual world on H5, the mobile web.

Leveraging H5's flexibility, the team not only optimized the codec system but also applied techniques such as texture compression when rendering images and certain footage, significantly cutting overhead and lowering the demands on local hardware.
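
The article doesn't specify which formats the team used; as a general illustration of texture compression on the web, a page can ask WebGL which GPU-compressed texture formats the device supports and upload pre-compressed data directly, so textures never have to be decoded into raw RGBA:

```typescript
// Hypothetical helper: pick a GPU texture compression format the device supports.
// GPU-compressed textures stay compressed in video memory, unlike PNG/JPEG,
// which must be decoded to raw RGBA before upload.
function pickCompressedFormat(gl: WebGLRenderingContext): { name: string; ext: object } | null {
  const candidates = [
    "WEBGL_compressed_texture_astc",   // modern mobile GPUs
    "WEBGL_compressed_texture_etc",    // most Android devices
    "WEBGL_compressed_texture_s3tc",   // most desktop GPUs
  ];
  for (const name of candidates) {
    const ext = gl.getExtension(name);
    if (ext) return { name, ext };
  }
  return null; // fall back to uncompressed textures
}

// Usage sketch: upload pre-compressed texture bytes with compressedTexImage2D.
function uploadCompressed(gl: WebGLRenderingContext, format: number, w: number, h: number, data: Uint8Array) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.compressedTexImage2D(gl.TEXTURE_2D, 0, format, w, h, 0, data);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  return tex;
}
```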

However, the team wasn't yet sure the plan was feasible, so the first trial run took place in the virtual live-streaming room of the karaoke app WeSing ("K Song for All").

Unexpectedly, the results were remarkably good, and before long a variety of new gameplay modes and scenarios grew out of it.

Encouraged, the technical team put in more than two months of intensive work, and a new 3D interactive technology solution called "end-cloud collaboration" was born.

Thanks to end-cloud collaboration, TMELAND can accommodate 100,000 users online at the same time.

More importantly, TMELAND compresses a virtual music world of more than 130,000 square meters into a tiny download, users need no extra hardware, and they don't even have to download and install a standalone app.

As long as your phone can get online, you can enter this amusement park anytime, anywhere.

On May 4, 2022, during this May Fourth special program produced with CCTV, more than one million users entered TMELAND as "digital humans".

Host Nigmat's digital "twin" was the first to arrive in the virtual world, sporting a "cyber" look.

Right after that, Gilly, Tong Heguang, and Xing Tong ("Star Pupil"), virtual humans under Tencent Interactive Entertainment, collectively "parachuted" onto the scene and co-starred in the short drama "New Youth" alongside young actors.

As the camera cut away, the virtual humans also appeared on the campuses of Renmin University of China, Beihang University, and Wuhan University, and danced with students from Heilongjiang University, Xi'an Jiaotong University, and Guangzhou University.

How virtual humans grow from 0 to 1

The three virtual humans in this show each have a different style, and looking closely at their stories roughly traces the path Chinese Internet companies such as Tencent have taken in developing virtual humans.

Take Xing Tong as an example: born as an NPC in "QQ Dance", she is now a virtual idol with a huge fan base who releases her own songs, has danced with the dancer Yang Liping, and appears on variety shows.

As the live-streaming pipeline behind her matured, Xing Tong became the first virtual human to live-stream in full 3D on the UE4 engine.

One major advantage of this pipeline is real-time rendering: what you see is what you get.

In other words, animation that used to take 2 to 3 months to produce offline can now be finished in 2 to 3 weeks, provided the assets are prepared in advance.

Building on this, Xing Tong also successfully completed the challenge of a 24-hour nonstop live stream in 2021.

If Xing Tong and Peace Elite's Gilly are NPCs who stepped out of the 2D world into the real one with the help of virtual human technology, then Tong Heguang sits somewhere between a hyper-realistic digital human and a high-fidelity one. More importantly, the technology underlying Tong Heguang was developed entirely in-house.

And that story begins with the first-generation high-fidelity digital human, "Siren".

The Siren project officially kicked off in May 2017, backed by a joint development team spanning four countries, with Tencent NExT Studios as the project owner responsible for rendering enhancements and performance optimization on the Unreal engine.

During this joint development, cross-border collaboration brought considerable risk and friction, from time zones and cultural differences to repeatedly shuttling massive amounts of data back and forth. After nearly a year of R&D, Siren finally debuted at GDC 2018 in March of the following year.

But in the end, even with the project in hand, the underlying technology still belonged to others.

Clearly, what Tencent wanted was more than simply "reusing" a ready-made model.

So Tencent began developing virtual humans with its own people and technical capabilities, no longer relying solely on foreign technology.

Self-developed virtual humans such as the high-fidelity digital human Matt and the WL.S member Tong Heguang were launched against this backdrop.

In 2021, Tencent and Xinhua News Agency teamed up to build, from scratch, the world's first digital astronaut.

Just three months later, this real-time, high-fidelity digital astronaut was born.

She has more than 5,000 micro-expressions, skin that responds to changing light, and a body that can turn freely, giving earthbound viewers a 360° look at rarely seen Martian scenery.

None of this would be possible without two of Tencent's self-developed pipelines:

xMoCap: end-to-end motion-capture animation production

xFaceBuilder: digital character production

The xMoCap pipeline integrates rigging, motion capture, asset management, animation tools, and NExT's motion-capture database, providing a one-stop service that covers capture, data processing, data output, and more.

Motion-capture data processing that once took 2 to 3 people a week can now be handled by a single person the day after the shoot, dramatically improving both the efficiency and the quality of character animation.

The xFaceBuilder digital character pipeline uses extensive algorithmic optimization to speed up facial production and can realistically generate faces, hair, and clothing, polished into an end-to-end workflow covering modeling, rigging, and animation.
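
Tencent hasn't published the internals of these pipelines; purely as a generic illustration of the kind of clean-up work a motion-capture pipeline automates, here is a small TypeScript sketch that smooths jittery raw joint data with a moving-average filter (all types and names are made up):

```typescript
// Generic illustration only -- not Tencent's xMoCap code.
// A raw mocap take: per-frame rotations (in degrees) for each joint.
type MocapFrame = Record<string, { x: number; y: number; z: number }>;

// Smooth jittery capture data with a simple moving average over a small window,
// one of the routine clean-up steps a mocap pipeline automates.
function smoothTake(frames: MocapFrame[], window = 5): MocapFrame[] {
  return frames.map((_, i) => {
    const start = Math.max(0, i - Math.floor(window / 2));
    const slice = frames.slice(start, start + window);
    const out: MocapFrame = {};
    for (const joint of Object.keys(frames[i])) {
      const avg = (axis: "x" | "y" | "z") =>
        slice.reduce((sum, f) => sum + f[joint][axis], 0) / slice.length;
      out[joint] = { x: avg("x"), y: avg("y"), z: avg("z") };
    }
    return out;
  });
}
```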

Add to these the self-developed live-streaming pipeline behind Xing Tong, the self-developed skeletal rigging pipeline behind Gilly... After five years of exploration, whether offline or real-time, Tencent's virtual humans have achieved "a thousand faces for a thousand people", and industrialized production has grown steadily more mature.

Virtual and real: where do we go from here?

In fact, "Fortnite" has also made many attempts at virtual concerts before.

Jay Chou's concert on the metaverse platform The Sandbox has also won plenty of fans.

Those, however, are purely virtual experiences. The virtual music world that CCTV and Tencent built on TMELAND rests instead on interaction between a real gala and a virtual experience, a breakthrough for the traditional TV program format.

And these virtual technologies are far from confined to virtual worlds; they are rapidly being reapplied to physical industries.

Not intuitive? No problem, here's an example.

You may have noticed that today's AI development has, to a greater or lesser extent, been shaped by AlphaGo.

After all, DeepMind showed us that a mature game environment can be used to advance artificial intelligence research.

In other words, the game in front of you is not just a game but an "AI boot camp" offering more scenarios, faster iteration, and lower cost.
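
To make the "boot camp" idea concrete, here is a minimal reinforcement-learning loop in TypeScript, with a toy game standing in for the training environment; the environment, agent, and reward below are all invented for illustration:

```typescript
// Minimal reinforcement-learning loop: the game is the "boot camp" for the agent.
// All names here (GridGame, act) are illustrative, not any real Tencent or DeepMind API.
interface Env<S, A> {
  reset(): S;
  step(action: A): { state: S; reward: number; done: boolean };
}

class GridGame implements Env<number, "left" | "right"> {
  private pos = 0;
  reset() { this.pos = 0; return this.pos; }
  step(action: "left" | "right") {
    this.pos += action === "right" ? 1 : -1;
    const done = this.pos >= 5;                 // toy goal: reach cell 5
    return { state: this.pos, reward: done ? 1 : 0, done };
  }
}

// A placeholder "agent": picks random actions; a real one would learn from rewards.
const act = (): "left" | "right" => (Math.random() < 0.5 ? "left" : "right");

const env = new GridGame();
let state = env.reset();
for (let t = 0; t < 100; t++) {
  const { state: next, done } = env.step(act());
  // A learning algorithm would update its policy here using (state, action, reward, next).
  state = done ? env.reset() : next;            // games restart cheaply -- that is the point
}
```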

Seen that way, it's not hard to understand why a group of Tsinghua and Peking University students study artificial intelligence inside Honor of Kings.

Take autonomous driving, often called a "crown jewel" of AI technology; one of its crucial stages is road testing.

Using digital twin technology to create a space-time where the virtual and the real interact, tests can be run in a "parallel world", greatly accelerating the arrival of the driverless era.

In 2018, Tencent launched TAD Sim, a virtual simulation system for autonomous driving that can efficiently build highly faithful test scenarios and run closed-loop simulation tests of a vehicle's perception, decision-making, and control algorithms.
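
TAD Sim's internals aren't public; to show what "closed loop" means here, the sketch below (TypeScript, all names and numbers hypothetical) runs one simulation tick after another in which perception, decision-making, and control read the virtual world's state, and the control output in turn changes that world:

```typescript
// Hypothetical closed-loop driving simulation tick -- not TAD Sim's actual API.
type WorldState = { egoSpeed: number; gapToLeadCar: number };            // m/s, metres

const perceive = (w: WorldState) => ({ gap: w.gapToLeadCar });           // idealized sensor model
const plan = (obs: { gap: number }) => (obs.gap < 20 ? "brake" : "cruise");
const control = (decision: string) => (decision === "brake" ? -2.0 : 0.5); // acceleration, m/s^2

function step(world: WorldState, dt: number): WorldState {
  const accel = control(plan(perceive(world)));        // perception -> decision -> control
  const egoSpeed = Math.max(0, world.egoSpeed + accel * dt);
  // The controlled vehicle changes the simulated world, which feeds the next tick: the loop is closed.
  return { egoSpeed, gapToLeadCar: world.gapToLeadCar + (15 - egoSpeed) * dt }; // lead car at 15 m/s
}

let world: WorldState = { egoSpeed: 20, gapToLeadCar: 30 };
for (let t = 0; t < 600; t++) world = step(world, 0.1);   // simulate 60 seconds at 10 Hz
```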

Of course, the impact of these world-building digital technologies on physical industries won't stop at AI and autonomous driving.

Another highly representative field is the digital culture industry.

The 2022 Beijing Winter Olympics brought its ethereal, romantic opening ceremony and the thrilling moments of high-speed winter sports to the world through new technologies such as 5G, ultra-high-definition video, free-viewpoint viewing, virtual reality, cloud rendering, and intelligent production and distribution.

In film and television, as game engines come into use and virtual human technology matures, the traditional production pipeline is also being upended.

The new approach is no longer constrained by green screens and live-action shooting; bringing virtual production into pre-production expands a film's creative boundaries and improves both workflow and efficiency.

In cultural heritage preservation, Tencent also keeps exploring what it can contribute, weaving digitalization into how cultural relics are collected, how their information is stored, and how that information is shared.

Beyond letting you dance at home, Tencent's digital technology can also let you "climb" the Great Wall and take a cloud "tour" of museums without leaving the house.

In December 2021, Tencent and the Beijing Municipal Cultural Heritage Bureau launched the "Digital Central Axis" project, which uses digital technology to restore, in virtual space, ancient buildings lost over the course of history, letting a wider audience experience the unique charm of Beijing's Central Axis through virtual interaction.

One thing is certain: our exploration of the virtual world will not stop at building a virtual utopia. What we look forward to even more is a better future where the virtual and the real are woven together.
