
How much computing power does the metaverse need? Intel: 1,000 times what we have now

Source: The Verge

Compiled by the Machine Heart Editorial Department

It's time to pour cold water on the metaverse.

Intel is not sitting out the metaverse race.

Intel issued its first statement on the metaverse this week, predicting a future virtual world that runs parallel to the physical one. But it also raised a key problem: there is nowhere near enough processing power to make the metaverse real.

Raja Koduri, senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group, said, "The metaverse could be the next major computing platform after the World Wide Web and mobile." But Koduri quickly poured cold water on the idea that the metaverse is imminent: "Our computing, storage, and networking infrastructure today is simply not enough to realize this vision."

Crucially, Koduri doesn't think we're even close. He said that realizing the metaverse will require 1,000 times the collective computing power available today.


Raja M. Koduri is Senior Vice President and General Manager of Intel's Accelerated Computing Systems and Graphics (AXG) Group.

Much of the metaverse hype centers on what people will actually do there, whether virtual reality conferences, digital concerts, or, of course, blockchain and NFT integrations. There have also been exciting developments in virtual and augmented reality headsets, including Meta's Quest.

But the actual building blocks of the metaverse are not just software, virtual spaces, or wearables like headsets. A future metaverse will rely on huge clusters of computers and servers to run massive shared virtual worlds. Intel, which has as much real-world experience here as anyone, argues that today's computers are not nearly powerful enough to realize the metaverse, and are in fact far from it.

Meta's flagship VR space, Horizon Worlds, can accommodate at most 20 participants at a time. And even for basic Roblox-style graphics, state-of-the-art VR still requires thousands of dollars of PC gaming hardware, with plenty of flaws besides.

As Koduri points out, current technology can't even place just two people in a truly detailed virtual environment. Consider what that would take:

Realistic clothing, hair, and skin tones, all rendered in real time based on the capture of real-world 3D objects, gestures, audio, and more;

Data transfer with ultra-high bandwidth and extremely low latency;

A persistent model of the entire environment, which may contain both real and simulated elements.

And that is just a two-person virtual environment, which simply can't scale to hundreds of millions of users; a Ready Player One-, Snow Crash-, or Matrix-style metaverse would require far more computing infrastructure.

Of course, this is not bad news for Intel, since the metaverse would demand a large number of far more advanced computers and servers. As a company that makes CPUs and GPUs for both consumer devices and data centers, Intel stands to benefit if the metaverse really does require a 1,000x increase in computing power.


Intel's goal is to reach 10 times today's transistor density, then expand chip area by 30-50%, while also looking for ways to go beyond the classical transistor.
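As a back-of-the-envelope check, the density and area gains in that roadmap compound multiplicatively. A minimal sketch of the arithmetic, using only the figures stated above (the resulting transistor budget is an illustration, not an Intel projection):

```python
# Rough transistor-budget arithmetic from the figures above:
# 10x density, plus 30-50% more chip area.
density_gain = 10.0
area_gain_low, area_gain_high = 1.3, 1.5

low = density_gain * area_gain_low    # ~13x transistors
high = density_gain * area_gain_high  # ~15x transistors
print(f"Transistor budget grows roughly {low:.0f}x to {high:.0f}x")
```

Even on those optimistic assumptions, the raw transistor budget grows by a factor in the low tens, nowhere near 1,000x.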

But the problem is that even Intel doesn't believe hardware alone can deliver a 1,000x increase in computing power. As Koduri explained in an interview with Quartz, "We think the standard Moore's Law curve will only get us about 8x or 10x growth over the next five years."

Instead, Koduri optimistically predicts that improvements in algorithms and software will fill the gap, pointing to machine-learning-driven neural networks and the kind of AI-enhanced computing Intel already uses in its Deep Link technology. Still, counting on algorithms and AI to deliver a hundredfold (or greater) improvement on top of Intel's existing hardware roadmap is a very tall order.
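The size of that gap follows directly from the two figures above. A minimal sketch of the arithmetic, assuming the article's 1,000x target and 8-10x hardware estimate (everything else is illustrative):

```python
# How much of the 1,000x must come from software and algorithms
# if hardware (Moore's Law) only delivers 8-10x in five years?
target_gain = 1000.0
hw_gain_low, hw_gain_high = 8.0, 10.0

sw_needed_best = target_gain / hw_gain_high   # 100x
sw_needed_worst = target_gain / hw_gain_low   # 125x
print(f"Software must contribute roughly "
      f"{sw_needed_best:.0f}x to {sw_needed_worst:.0f}x on top of hardware")
```

In other words, software and algorithms would have to multiply effective performance by roughly two orders of magnitude beyond anything the hardware roadmap provides.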

Koduri also noted in an interview with Quartz that improving software and algorithms is not only necessary to close the implementation gap, but also critical to reducing energy consumption.

"Software will fill any gaps left by hardware" seems to be standing idly by. But the reality is that many big tech companies believe that AI and machine learning will solve their computing problems, and everything from improving smartphone cameras to providing upgraded game visuals will rely on AI.

However, when hardware improvements can deliver only about a 10x increase, relying on software and algorithms for the remaining 100x is still a daunting task. There is a long way to go before the metaverse is truly realized.

https://www.theverge.com/2021/12/15/22836401/intel-metaverse-computing-capability-cpu-gpu-algorithms
