Can AI escape the "energy crisis"? GPT-6 may be bottlenecked by electricity!

"Science and Technology Innovation Board Daily" on March 27 GPT-5 has not yet been launched, and OpenAI seems to have begun to train GPT-6, but electricity may have become a "bottleneck" problem.

Kyle Corbitt, co-founder and CEO of AI startup OpenPipe, revealed that he recently spoke with a Microsoft engineer working on the GPT-6 training cluster project, who complained that deploying InfiniBand-class links between GPUs located in different regions was painful.

When asked why the training cluster wasn't simply placed in a single region, the Microsoft engineer replied, "Oh, we've tried that, but if you put more than 100,000 H100 GPUs in one state, the grid collapses."

For reference, a report by market research firm Factorial Funds estimates that OpenAI's text-to-video model Sora needs as many as 720,000 H100s during peak periods, a number that, by the engineer's reckoning, would be enough to bring down the power grids of seven states.

▌Can data centers be profitable, and how long do they take to build?

At the just-concluded S&P Global Cambridge Energy Week (CERAWeek) 2024, energy industry executives from around the world discussed the advance of AI technology in the industry and AI's enormous demand for energy.

"By 2030, AI will consume more electricity than households. Toby Rice, CEO of EQT, the largest natural gas producer in the United States, cited such a prediction in his speech.

Bill Vass, vice president of engineering at Amazon Web Services, pointed out that the world adds a new data center every three days.

Bill Gates said that electricity is the key to whether a data center can be profitable, and that the amount of electricity consumed by AI is staggering: the use of AI will drive up energy demand, and the development of AI in the coming years may be constrained by chip design and power supply.

This is not unfounded: the gap between supply and demand is already starting to show, as new data centers are being built faster than new power plants. According to CBRE Group, Inc., a U.S. commercial real estate services firm, data center construction timelines have been stretched by two to six years because of delays in securing power supply.

▌ "Energy Behemoth"

AI has not earned the title of "energy behemoth" for nothing.

OpenAI's Sam Altman has complained about AI's energy needs, especially its demand for electricity. At the Davos forum earlier this year, he said that the development of AI requires a breakthrough in energy, because AI will bring far more demand for electricity than expected.

According to some estimates, ChatGPT consumes more than 500,000 kilowatt-hours of electricity a day to handle about 200 million user requests, equivalent to more than 17,000 times the daily electricity consumption of an average American household. As for search giant Google, if it invoked generative AI for every user search, its annual electricity consumption would rise to about 29 billion kilowatt-hours, higher than the annual electricity consumption of countries such as Kenya and Guatemala.
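Those figures hold up to a quick back-of-envelope check. The minimal Python sketch below assumes an average American household uses roughly 29 kWh of electricity per day (about 10,500 kWh per year); that household figure is an assumption based on typical U.S. averages, not a number given in the reports cited here.

    # Back-of-envelope check of the ChatGPT electricity figures quoted above.
    # Assumption: an average U.S. household uses roughly 29 kWh per day
    # (about 10,500 kWh per year); this is an illustrative estimate.

    CHATGPT_DAILY_KWH = 500_000        # reported daily electricity consumption
    DAILY_REQUESTS = 200_000_000       # reported daily user requests
    HOUSEHOLD_DAILY_KWH = 29           # assumed average U.S. household usage

    energy_per_request_wh = CHATGPT_DAILY_KWH * 1000 / DAILY_REQUESTS
    household_multiple = CHATGPT_DAILY_KWH / HOUSEHOLD_DAILY_KWH

    print(f"Electricity per request: {energy_per_request_wh:.1f} Wh")  # about 2.5 Wh
    print(f"Household multiple: {household_multiple:,.0f}x")           # about 17,000x

Under those assumptions, each request works out to roughly 2.5 watt-hours, and the daily total to a bit over 17,000 households' worth, consistent with the figures quoted above.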

Looking back at 2022, before the current large-scale AI boom, data centers in China and the United States accounted for about 3% and 4%, respectively, of each country's total electricity consumption.

With global computing power continuing to grow, Huatai Securities predicted in a March 24 research report that by 2030 total data center electricity consumption in China and the United States will reach 0.95/0.65 trillion kWh and 1.7/1.2 trillion kWh respectively (the two figures corresponding to its different forecast scenarios), more than 3.5 times and 6 times the 2022 levels. Under the optimistic scenario, AI electricity consumption in China and the United States in 2030 would be equivalent to 20% and 31%, respectively, of each country's total 2022 electricity consumption.

Analysts further note that because data centers are not evenly distributed, regional power shortages will appear first in areas where they are concentrated (Virginia in the United States, for example). And given that U.S. electricity demand has barely grown in recent years, AI is expected to be an important driver in returning electricity demand in developed overseas markets to positive growth.

▌Where will the additional electricity come from?

Amid the global push for carbon neutrality, clean energy such as photovoltaics and wind power would seem to be the obvious first choice, but for now it remains only the "ideal" choice.

"We can't build 100 gigawatts of renewable energy (power plants) in a few years. It's a bit tricky. Former U.S. Secretary of Energy Ernest Moniz admitted.

EQT CEO Toby Rice added that tech companies need a level of reliability that renewables such as wind and solar cannot provide, while large nuclear facilities (only one is currently under construction in the U.S.) have historically been expensive and time-consuming to build. "Tech companies aren't going to wait 7-10 years for this infrastructure; they're going to have to use natural gas."

Rice, whose company is the largest U.S. natural gas producer, said tech companies building data centers have already approached EQT about buying gas, asking, "How fast can you deliver it?" and "How much gas can we get?"

▌No longer a "hidden corner" of the U.S. stock market

First a "GPU shortage," then a "power shortage": the development of AI can hardly be described as smooth sailing.

It is worth noting that U.S. equity investors hoping to ride the AI wave have already set their sights on this corner of the market.

Vistra Energy, one of the largest power producers and retail energy suppliers in the United States; Constellation Energy, the largest producer of clean energy in the United States; and NRG Energy, a major U.S. green power company, have all seen their share prices more than double over the past year, and all hit record highs this week.

Judging by their gains over the past year and year to date, these three companies may not match Nvidia, the strongest performer on the market, but they have comfortably outpaced Microsoft, the company behind OpenAI.

(Science and Technology Innovation Board Daily, Zheng Yuanfang)
