
What is the critical path to achieving the metaverse? | Q recommended

Author | Wang Yipeng

The hottest new concept of 2021 is the metaverse. On October 29, 2021, Facebook announced that it would change its name to Meta; on November 1, 2021, Roblox, the so-called "first metaverse stock," announced that it was back online after a brief outage. Offline and online discussions about the metaverse have been in full swing ever since, and the heat around the concept is plain to see.

1

Dismantling the metaverse layer by layer

Shen Yang, professor and doctoral supervisor at the School of Journalism of Tsinghua University, noted at an event that the English word "Metaverse", read literally, consists of two parts: Meta (transcendence) and Universe. Professor Shen Yang's team has also given a relatively precise definition of the metaverse:

The metaverse is a new type of Internet application and social form that fuses the virtual and the real by integrating a variety of new technologies. It provides immersive experiences based on extended-reality technology, generates a mirror image of the real world based on digital-twin technology, and builds an economic system based on blockchain technology. It tightly weaves the virtual world and the real world together across economic, social, and identity systems, and allows every user to produce content and edit the world.

Beamable founder Jon Radoff has broken the metaverse down at the industry level into seven layers of construction: experience; discovery; creator economy; spatial computing; decentralization; human-computer interaction; and infrastructure.

The experience layer is relatively easy to understand: the gaming and social-networking companies we often see today all work at this layer. The game Second Life is a classic example. In it, users are called "residents" and interact with one another through movable avatars. The program also provides a high-level social networking service on top of a shared virtual world: residents can wander around, meet other residents, socialize, take part in individual or group activities, and create and trade virtual property and services with one another. Ready Player One is likewise people's free imagining of the metaverse at the experience layer.

The discovery layer is an important way for users to find their way to the experience layer; it includes the various app stores, and the main players are large Internet companies.

Creator economy: helps metaverse creators monetize their work, including design tools, monetization technologies, animation systems, and graphics tools.

Spatial computing: empowers the creator-economy layer, including 3D engines, gesture recognition, spatial mapping, and artificial intelligence; the main players are 3D hardware and software vendors.

Decentralization: companies at this layer mainly help metaverse ecosystems build distributed architectures to form a democratized structure.

Human interface: the human-computer interaction layer is chiefly the medium through which the public touches the metaverse, spanning touch, posture, voice, even neural signals. Its products include AR/VR headsets, mobile phones, computers, cars, smart glasses, and other wearables; the main participants are 3D software and hardware manufacturers.

Infrastructure: 5G, semiconductor chips, new materials, cloud computing, and telecommunications networks. The infrastructure layer is most likely a contest between giants, most of them basic hardware companies.

It can be said that the metaverse is a concentrated outlet for the future needs of the entire human economy: users' desire for new experiences, capital's desire for new outlets, and technology's desire for new fields. It is an inevitable concept once science and technology reach a certain stage. Even without the word "metaverse" in 2021, some other label such as "meta-world" or "meta-matrix" would likely have appeared.

2

The key supporting technologies of the metaverse

The definition and industrial layering above have recently become familiar to many people, but they still do not explain the path to realization. In the end, what we most want to figure out is how to actually build the metaverse of our dreams.

From a technical perspective, the key supports of the metaverse can be summarized as "HNCBD": Hardware, Networking and computing, Content and application ecology, Blockchain and NFTs, and Digital twins. Different people may draw the boundaries of these core technologies slightly differently, but the overall picture does not vary much.

Within "HNCBD", H is hardware and falls outside the scope of what software developers usually discuss; C depends on a variety of application communities; and networking and computing power, blockchain and NFTs, and digital twins in fact share a unified carrier: cloud computing.

Clear-eyed observers have long seen that, setting aside the wheel-reinventing caused by commercial competition, the best way to realize the metaverse is the cloud. In a sense, the cloud carries not only the metaverse's unprecedented demand for computing power and infrastructure, but also the various PaaS and SaaS services layered on top of that infrastructure. If every application provider and content provider had to rebuild the infrastructure itself, including basic data lake and warehouse services, digital-twin services, and machine-learning services, the cost would be unimaginable.

At the current stage of cloud computing, besides basic computing power, the most critical thing is to offer sufficiently mature products in three directions: games, AI algorithms, and VR. The most representative vendor here is Amazon Cloud Technology.

Think back to the film Ready Player One: the actors put on their glasses and enter the game world. That is, in effect, a typical cloud-gaming scene.

Today's large-scale games use a server + client model with relatively high hardware requirements on the client; 3D graphics rendering, in particular, depends almost entirely on the terminal. With the arrival of the 5G era, games will increasingly be rendered on cloud GPUs at scale, with the frames compressed and streamed to users over high-speed 5G networks.

On the client side, the user's gaming device then needs no high-end processor or graphics card, only basic video-decoding capability. From a game-development perspective, platforms can ship new game features faster, reducing the build and test effort required to launch a game while still meeting player needs.
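The division of labor described above can be sketched as a tiny pipeline. This is a minimal illustration, not a real streaming stack: the "rendering" is mock pixel data and the "codec" is plain zlib compression standing in for a real video codec such as H.264.

```python
import zlib

def render_frame_on_cloud_gpu(frame_id: int) -> bytes:
    """Stand-in for cloud GPU rendering: produce a raw frame buffer."""
    # A real service would rasterize the 3D scene on a cloud GPU;
    # here we synthesize deterministic mock pixel data for illustration.
    return bytes((frame_id + i) % 256 for i in range(4096))

def encode_for_transport(raw_frame: bytes) -> bytes:
    """Stand-in for a video codec: compress the frame for the network link."""
    return zlib.compress(raw_frame)

def client_decode(packet: bytes) -> bytes:
    """The thin client only needs basic decompression, not a high-end GPU."""
    return zlib.decompress(packet)

# One frame's round trip through the pipeline:
frame = render_frame_on_cloud_gpu(7)
packet = encode_for_transport(frame)
assert client_decode(packet) == frame   # client recovers the exact frame
```

The point of the sketch is where the cost lives: all the heavy work happens in `render_frame_on_cloud_gpu` on the server side, while the client's only job is `client_decode`.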

In September 2020, Amazon launched its own cloud-gaming platform, Luna, compatible with PC, Mac, Fire TV, iPad, iPhone, and Android. The well-known game and platform maker Epic Games also uses Amazon EC2 and other Amazon Cloud Technology services to scale capacity on demand and support remote creators. Amazon G4 instances drive cloud-game rendering on the GPU and stream even the most complex cloud games through the NVIDIA Video Codec SDK. The NVIDIA T4 GPUs in Amazon G4 instances were also the first GPU instances in the cloud to provide RT cores and support NVIDIA RTX real-time ray tracing.

The metaverse experience is not limited to cloud games: cloud gaming is just one scene, and VR is the path.

The limitations of traditional VR applications are mainly reflected in four aspects: the high cost of purchasing host and terminal hardware, low device utilization, fragmented content, and limited mobility.

Combining cloud computing with VR migrates GPU rendering from local machines to the cloud, making terminals lighter and more cost-effective and reducing the cost of hardware purchases. VR developers can iterate quickly on content published in the cloud, and users can play instantly without downloading, which addresses the content-fragmentation problem.

Take Amazon Sumerian as an example: developers can easily create 3D scenes and embed them into existing web pages. The Amazon Sumerian editor provides out-of-the-box scene templates and intuitive drag-and-drop tools that let content creators, designers, and developers build interactive scenes. Using the latest WebGL and WebXR standards, Sumerian delivers immersive experiences directly in the web browser, reachable in seconds through a simple URL and running on the major AR/VR hardware platforms.
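The "template plus drag-and-drop, published behind a URL" workflow can be illustrated with a small sketch. The data shapes and names below are entirely hypothetical, not Sumerian's actual scene format; the idea is only that editor interactions ultimately produce structured scene data that a browser fetches and renders with WebGL/WebXR.

```python
import json

# Hypothetical starter template, loosely modeled on editor-style scene templates.
SCENE_TEMPLATE = {
    "environment": {"skybox": "default", "lighting": "studio"},
    "entities": [],
}

def add_entity(scene, name, mesh, position):
    """A drag-and-drop gesture in a visual editor ultimately yields data like this."""
    entity = {"name": name, "mesh": mesh, "position": list(position)}
    return {**scene, "entities": scene["entities"] + [entity]}

def publish_scene(scene, scene_id):
    """Serialize the scene to JSON; a browser client would fetch it by URL
    and render it client-side."""
    return f"/scenes/{scene_id}.json", json.dumps(scene)

showroom = add_entity(SCENE_TEMPLATE, "host-avatar", "avatar.glb", (0, 0, -2))
url, body = publish_scene(showroom, "demo")
```

Keeping the scene as plain data is what makes the "access in seconds via a simple URL" model possible: the heavy assets and rendering logic live in the browser runtime, and the scene itself is just a small document.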

Besides cloud gaming and VR, there is one more key variable in realizing the metaverse: AI. AI can shorten digital-creation time and provide underlying support for the metaverse, chiefly in computer vision, intelligent speech and semantics, and machine learning. All three demand enormous computing power and storage, and cloud computing supplies artificial intelligence with effectively unlimited amounts of both. GE Healthcare, for example, uses Amazon P4d instances to cut the processing time of customized AI models from days to hours, training models two to three times faster while providing various telemedicine and diagnostic services.

The value of AI is even more obvious for avatars, and Amazon Cloud Technology's AI services have many practical applications in this field: AI image generation (automatic coloring, scene adjustment, anime-style conversion), automatic model generation (automatic animation generation, scene-prop generation), game bots (AI NPCs, text interaction, voice-driven lip-sync animation, motion capture, expression transfer), and virtual-idol marketing operations (chat monitoring, popular outfit pairing, anti-cheating), among others.

3

The core work of cloud computing companies in the metaverse field

If the breakdown above is more theoretical analysis, then a closer look at the recent moves of the leading cloud providers shows that the various technical supports for the metaverse are already becoming a reality in the cloud.

At Amazon Cloud Technology re:Invent 2021, the company released Amazon IoT TwinMaker and Amazon Private 5G.

The former lets developers easily aggregate data from multiple sources, such as device sensors, cameras, and business applications, and combine that data into a knowledge graph that models real-world environments. It is one of the technologies underpinning the industrial metaverse.
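The multi-source aggregation described above can be sketched in a few lines. This is a toy illustration of the general idea, not TwinMaker's API; the entity names, properties, and values are all invented for the example.

```python
from collections import defaultdict

# Hypothetical readings from three data sources feeding a digital twin.
sensor_data = [("pump-1", "temperature_c", 68.5), ("pump-1", "vibration_mm", 0.02)]
camera_data = [("pump-1", "video_feed", "rtsp://plant.example/pump-1")]
business_data = [("pump-1", "last_service", "2021-11-01")]

def build_knowledge_graph(*sources):
    """Merge per-entity properties from every source into one graph node,
    the kind of aggregation a digital-twin service performs."""
    graph = defaultdict(dict)
    for source in sources:
        for entity, prop, value in source:
            graph[entity][prop] = value
    return dict(graph)

twin = build_knowledge_graph(sensor_data, camera_data, business_data)
# twin["pump-1"] now holds sensor, camera, and business properties in one place.
```

The value of the knowledge-graph shape is exactly this merge: once sensor, camera, and business records hang off the same real-world entity, applications can query one model instead of three silos.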

The latter automatically sets up and deploys an enterprise's own private 5G network and expands capacity on demand to support more devices and network traffic. It targets the huge fleets of sensors and edge devices of Industry 4.0, and it naturally sits in the same lineage as the industrial metaverse and the Internet of Vehicles mentioned above.

Then there is Amazon SageMaker Canvas, which takes a no-code approach to building machine-learning models and making predictions, so that services can still be delivered without a data-engineering team. It further lowers the threshold for future metaverse content production and helps ensure content diversity.

Also during re:Invent 2021, the metaverse company Meta announced a deepened partnership with Amazon Cloud Technology as its strategic cloud service provider.

According to reports, Meta uses Amazon Cloud Technology's reliable infrastructure and comprehensive capabilities to supplement its existing on-premises infrastructure, and will use more of its compute, storage, database, and security services to gain better privacy protection, reliability, and scalability in the cloud. This includes running third-party partner applications on Amazon Cloud Technology and using cloud services to support companies that are already its customers.

Meta will also use Amazon Cloud Technology's compute services to accelerate AI projects within its Meta AI division. In addition, the two companies will work together to help customers improve the performance of PyTorch, the deep-learning framework, running on Amazon Cloud Technology, and to help developers speed up building, training, deploying, and running AI and machine-learning models.

Zhang Wenyi, Amazon global vice president and executive director of Amazon Cloud Technology Greater China, believes this is an area that cloud computing can greatly empower. As she put it: "We think the metaverse must be an area that cloud computing can greatly empower. What the metaverse itself needs is computing, storage, machine learning, and so on, none of which can be separated from cloud computing."

4

The future is still being drawn

Will the metaverse's technology stack expand in the future, and will the way the metaverse presents itself change dramatically?

The answer is almost certainly yes, just as before 4G phones became widespread we could not have imagined the main kinds of applications in the 4G ecosystem. On computing power alone, the metaverse still has at least ten years to go from construction to maturity.

Still, following the iteration of the metaverse's supporting technologies in the cloud computing field may be an important way for us to set aside the bubble and observe the real progress of the metaverse ecosystem.
