Intel's first statement on the metaverse – While broadly optimistic about the metaverse's possibilities, Intel raises a key problem with actually building it: there is nowhere near enough processing power to pull it off.
Raja Koduri, senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group, called the metaverse "probably the next major computing platform after the World Wide Web and mobile." But Koduri quickly poured cold water on that vision: "Our computing, storage, and networking infrastructure today is simply not enough to realize this vision." Crucially, Koduri doesn't think we're even close. He said our current collective computing power needs to increase by a factor of 1,000.
A lot of metaverse hype is built around what you'll be doing there, whether that's virtual reality meetings, digital concerts, or, of course, blockchain and NFT-based integrations. The future of virtual and augmented reality headsets is also exciting, whether it's Meta's Quest line (formerly known as Oculus) or Apple's long-rumored AR/VR device.
But the metaverse's actual building blocks aren't just the software and virtual spaces, or even the headsets and gadgets people will wear. It will live on the computers and servers running the huge shared virtual worlds that proponents see as the future of technology. And that's where Intel delivers its biggest reality check: today's computers simply aren't powerful enough to make those dreams come true.
Meta's Horizon Worlds can accommodate up to 20 participants in one space. State-of-the-art VR still requires thousands of dollars of PC gaming hardware, and even then comes with plenty of compromises. Even the largest traditional video games, like Fortnite or Battlefield 2042, which don't have to handle the added demands of VR, top out at 100 to 128 players at a time.
As Koduri points out, today's technology can't even put two people together in a truly detailed virtual environment. "Consider what it takes to put two people in a social setting in an entirely virtual environment: convincing and detailed avatars with realistic clothing, hair, and skin tones – all rendered in real time and based on capturing real-world 3D objects, gestures, audio, and more; data transfer at ultra-high bandwidth and extremely low latency; and a persistent model of the environment, which may contain both real and simulated elements."
Of course, Intel also has a vested interest in saying we need more and better computers and servers. After all, Intel makes CPUs for consumer devices and data centers, among other things. If the metaverse requires a 1,000x increase in computing power, that's good news for Intel's business. It's no coincidence that Intel explicitly mentions its client computing and cloud processors and graphics products in its editorial on the metaverse.

The problem is, even Intel doesn't think hardware alone can get us to that 1,000x. As Koduri explained in an interview with Quartz, the standard Moore's Law curve will only deliver about 8x to 10x growth over the next five years.
Instead, Koduri optimistically predicts that algorithm and software improvements will close the gap: things like machine-learning-driven neural networks and the AI-enhanced computing techniques Intel already uses in its Deep Link technology and the upcoming XeSS supersampling, which is set to debut alongside Arc GPUs early next year. That's a big ask, though; Intel is counting on algorithms and AI to deliver a hundredfold (or more) increase in effective computing power on top of the gains from its existing hardware roadmap.
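To put those numbers in perspective, here's a rough back-of-the-envelope sketch (my own arithmetic, not Intel's) of how the 1,000x target splits between hardware and software, assuming transistor-driven gains compound at a classic Moore's Law pace of roughly one doubling every two years:

```python
# Back-of-the-envelope split of the 1,000x target between hardware and software.
# Assumption (not from Intel): Moore's Law-style doubling roughly every 2 years.

target_gain = 1000          # Koduri's overall target for the metaverse
years = 5                   # the time frame he discusses

# Hardware: doubling every ~2 years compounds to 2^(5/2) ~= 5.7x over five years,
# in the same ballpark as the 8x-10x Koduri attributes to the Moore's Law curve.
hardware_gain = 2 ** (years / 2)

# Software and algorithms have to supply whatever is left of the 1,000x.
software_gain_needed = target_gain / hardware_gain

print(f"Hardware gain over {years} years: ~{hardware_gain:.1f}x")
print(f"Remaining gain needed from software/AI: ~{software_gain_needed:.0f}x")
# With the 8x-10x Koduri cites, software still has to deliver roughly 100x-125x.
```

Whether you use the compounded figure or Koduri's more optimistic 8x to 10x, hardware alone leaves a gap of about two orders of magnitude for software and AI to fill.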
Koduri noted in the same Quartz interview that better software and algorithms aren't just needed to close the gap within the ambitious five-year window he proposes; they're also essential to rein in the energy consumption that brute-forcing such hard problems would otherwise demand, a situation he compares to the power draw of cryptocurrency mining today.
Many big tech companies are betting that AI and machine learning will solve their computing problems, from improving smartphone cameras to upscaling game visuals, and the appeal is easy to see. But relying on them to make tomorrow's computing 100 times better still seems like a daunting task, especially when hardware improvements alone are only expected to deliver about a 10x gain.