
Intel's metaverse plan starts with "borrowing" graphics

As the hottest topic in tech right now, the metaverse has given countless companies an excuse to tout their technical reserves and roadmaps. Semiconductor giants such as AMD and Nvidia have announced metaverse-related news one after another, sending their stock prices soaring, and Intel has recently found itself unable to sit still any longer.

In a recent article, Raja Koduri, senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group (and former chief architect of AMD's graphics division), argued that "the metaverse may be the next major computing platform after the World Wide Web and the mobile internet." At the same time, he cautioned that "our current computing, storage, and networking infrastructure is simply not enough to realize a true metaverse," and that "we need 1,000 times or more of the collective computing performance of today's servers."


Although Raja Koduri believes a true metaverse cannot be realized today, that has not stopped Intel from laying the groundwork. Beyond metaverse infrastructure such as processors and servers, Intel has also announced a new software technology, and some commentators believe it could become one of the cornerstones of the metaverse.

Intel's metaverse, starting with software?

Intel recently announced a new gaming technology called Continual Compute. Its main purpose is to solve the problem of low-performance devices being unable to run demanding applications, so that any device can run AAA or even metaverse-grade games and applications.

In the technology demonstration, Intel had a laptop with no discrete graphics successfully run Hitman 3. As a AAA game released this year, Hitman 3 has anything but low PC requirements; even mainstream discrete graphics cards struggle to run it smoothly.

With the help of Intel's Continual Compute, the laptop borrows a high-end discrete GPU from another device on the same network: it transmits the data that needs to be computed over the network, receives the computed results back, and finally presents them on the user's display.


Sound familiar? Gamers familiar with PC gaming will probably first think of cloud gaming or streaming. Neither is new today, but for a variety of reasons neither has been widely adopted over the years.

Note, however, that although it sounds similar to those two technologies, Intel's Continual Compute is fundamentally different at the application level. Whether it is cloud gaming or streaming, both run the game itself on a cloud server or another device, and then transfer the rendered game footage to your device over the network.

Intel's Continual Compute instead enlists the high-performance hardware of other devices to participate in the computation, transmitting the raw computation data rather than the finished picture over a high-speed network and dedicated software protocols. The difference is that with Continual Compute, the conversion of the computed results into a displayed image actually happens on your PC, making your PC part of the overall computing system; with streaming, your PC is just a monitor, and you are effectively watching a "video."
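The contrast can be sketched in a few lines of Python. Everything below is invented for illustration (`RemoteGPU`, `present_locally`, and the string "frames" are not Intel APIs); the sketch only shows where the final frame gets assembled in each model.

```python
class RemoteGPU:
    """Stand-in for a high-end GPU elsewhere on the network (illustrative)."""

    def run_game_and_render(self, scene):
        # Streaming model: the remote machine does everything and
        # returns a finished video frame.
        return f"frame({scene})"

    def compute(self, scene):
        # Continual Compute model: only the heavy math runs remotely;
        # raw results come back over the network.
        return f"results({scene})"


def present_locally(results):
    # The local PC converts raw results into a displayable frame itself.
    return f"frame({results})"


def cloud_streaming(remote, scene):
    # Your PC is just a monitor: it shows a frame rendered elsewhere.
    return remote.run_game_and_render(scene)


def continual_compute(remote, scene):
    # Your PC stays part of the computing system: it sends work out,
    # gets numbers back, and does the final presentation itself.
    return present_locally(remote.compute(scene))
```

In the streaming path the local machine never touches the computation; in the Continual Compute path the last step always runs locally, which is why hardware on both ends matters.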

In a poor network environment, streaming suffers obvious problems such as stuttering and degraded image quality. Intel's new technology, by contrast, can temporarily lower its data-transmission requirements in the face of network fluctuations by reducing the frame rate and similar means, keeping the user's gaming experience smooth.
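That adaptive behavior might look something like the sketch below. The cost model (`mbps_per_fps`) is an assumption made up for illustration, not Intel's published algorithm: the sender lowers the frame rate until the required data rate fits the measured bandwidth.

```python
def choose_frame_rate(bandwidth_mbps, target_fps=60,
                      mbps_per_fps=0.5, min_fps=30):
    """Pick the highest frame rate the current bandwidth can carry.

    mbps_per_fps is an invented cost model: each frame's worth of
    compute data is assumed to need a fixed slice of bandwidth.
    The result is clamped between min_fps and target_fps.
    """
    affordable_fps = int(bandwidth_mbps / mbps_per_fps)
    return max(min_fps, min(target_fps, affordable_fps))
```

Under these assumed numbers, a healthy 40 Mbps link sustains the full 60 fps, while a dip to 20 Mbps drops the target to 40 fps instead of letting the picture stutter.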


Moreover, because the remote hardware is invoked directly, hardware-specific features can also be supported, such as Nvidia's DLSS 2.0. These are difficult to expose through streaming technologies like cloud gaming, but with Continual Compute, users may in the future get these advanced AI features directly on their laptops.

So the question becomes: where does this device with a high-performance graphics card and processor come from? Intel's answer is "requisitioning idle devices on the same network," provided, of course, that the owner agrees to lend the device. In short, there are two prerequisites for enabling this feature: the devices must be on the same network, and the other party must consent to their device being borrowed. Only when both conditions are met can the function be activated.
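Those two prerequisites amount to a simple check. This is a hypothetical sketch with made-up field names, not Intel's protocol:

```python
def can_borrow(requester_subnet, device):
    """A device may be 'requisitioned' only if it sits on the same
    network as the requester AND its owner has opted in."""
    return (device["subnet"] == requester_subnet
            and device["owner_consents"])
```

Either condition failing on its own (wrong network, or no consent) is enough to block the borrow.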

In theory, any hardware that supports the software protocol can join this "rental equipment" network: whether the other device carries AMD or Nvidia hardware, as long as its owner is willing, it can be borrowed by your PC. However, in the technology preview Intel has shown so far, only protocol handshakes between Intel processors are supported, and the supported graphics card types are not described.

Judging from the gameplay videos Intel presented, at the same picture quality there is no significant difference in smoothness between games run through Continual Compute and games run directly on the local machine. It is worth noting, though, that Hitman 3 was run with its graphics settings on low, and Intel did not show how the technology performs at high quality. According to the explanatory documents Intel provided, the technology is still in development, and what has been shown so far is only an interim result.

Will Continual Compute be one of the cornerstones of the "metaverse"?

Why, after Intel demonstrated Continual Compute, would anyone call it a cornerstone of the future "metaverse"? Any discussion of the metaverse has to mention virtual-reality technologies such as AR and VR; only with their help can a real "metaverse" be achieved.

However, whether it is AR, VR, or the more comprehensive XR, serving users presupposes that they own a device powerful enough to run the software. Take VR games as an example: a typical VR game today requires a device with performance well beyond a PS4 to run, and even then the result is not enough to give players a truly immersive experience.

Once you factor in the rendering and real-time feedback that immersive experiences demand, even the most high-end personal PCs fall short. That is why Intel's executive argued in the article that we need to improve computing performance by more than 1,000 times to meet the requirements of true "metaverse" software.


From the perspective of technological development, we will no doubt eventually be able to build devices that meet this performance requirement, but their initial size and power consumption will certainly not be low, and they are unlikely to be something individuals can carry around and use anywhere. If metaverse users at that time want a complete gaming experience more conveniently, they could use Intel's Continual Compute to directly call on a high-performance PC sitting on the other side of the room, or rent one over the network.

With Continual Compute, the user's own device needs only a processor of modest performance, a display, and a network connection to run applications that would otherwise require high-performance hardware. In short, Continual Compute lets users tap the performance of a high-end PC anytime, anywhere, to help them play games or launch demanding applications; in the metaverse, this technology would let players get the full experience at any time, at a lower cost.

So what advantages does Intel's technology have over cloud-computing services such as cloud gaming? First, the individual or service provider offering the service does not need to build a huge online server farm; they only need to connect devices equipped with the relevant hardware to Intel's shared network. Second, the user obtains data from the nearest device, which minimizes the latency and packet loss caused by long-distance data transmission.

Moreover, in Intel's plan, this technology will eventually form a pool of computing resources at the network level, with computing resources allocated through the intelligent provisioning mechanisms of Windows and other operating systems. Every idle computer connected to the network becomes part of this resource pool.
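Such a pool could be sketched as below. The selection policy (idle, consenting, same network, lowest latency first) is one reading of the article's description combined with the two prerequisites above, not a documented Intel mechanism:

```python
def allocate(pool, requester_subnet):
    """Pick a machine from the pool for the requester: it must be idle,
    its owner must consent, and it must share the requester's network.
    Among candidates, prefer the one with the lowest measured latency."""
    candidates = [d for d in pool
                  if d["idle"]
                  and d["owner_consents"]
                  and d["subnet"] == requester_subnet]
    return min(candidates, key=lambda d: d["latency_ms"]) if candidates else None
```

Preferring the lowest-latency candidate is what makes this a "nearest device" scheme rather than a distant cloud server.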


Reading this, Xiao Lei was reminded of idle-PC resource-sharing schemes such as the Phoenix Project, in which users earn varying returns by contributing resources. In the metaverse of the future, perhaps we will be able to buy computing resources from the resource pool the way we buy mobile data today, without having to bear the cost of high-end hardware ourselves.

In addition, according to Intel, Continual Compute is only one of its technical reserves for the metaverse. The company is also exploring how to reduce the computing-performance requirements of immersive content such as VR through deep learning, neural networks, and other techniques.

However, Intel also admits that over the next five to ten years, chip performance will still struggle to meet the demands of running a real metaverse; real-time interaction and ultra-fine real-time rendering pose huge challenges for performance, networking, and other infrastructure. So the first use of Continual Compute may simply be to become another "cloud gaming"-style service; at least for current Minecraft-level "metaverse" games, the technology does not have much use.
