Is AI sucking up the world's electricity? Something worse is yet to come

In recent years, the rise of artificial intelligence (AI) has sparked widespread discussion and concern. Many people worry that AI will send the unemployment rate soaring, while some optimists joke that "as long as electricity costs more than steamed buns, AI will never be able to completely replace people."

Although this is a joke, it points to a real problem: AI's energy consumption. More and more people worry that high energy consumption will become a bottleneck restricting AI's development. Not long ago, technology entrepreneur and former Google engineer Kyle Corbitt said on the social platform X that Microsoft has already run into this difficulty.

How much power does AI really use?

According to Corbitt, the Microsoft engineers training GPT-6 are busy building InfiniBand (IB) networks to connect GPUs spread across different regions. The job is difficult, but they have no choice: if more than 100,000 H100 chips were deployed in a single region, the local power grid would collapse.

Source: X@corbtt

Why would concentrating these chips in one place cause the power grid to collapse? Let's do the math.

According to data published on NVIDIA's website, each H100 chip has a peak power of 700 W, so 100,000 H100 chips have a combined peak draw of 70 megawatts (MW). In the X comment section, energy industry practitioners pointed out that the total consumption of 100,000 such chips would be roughly the entire output of a small solar or wind farm. On top of that, the supporting facilities for so many chips, including servers and cooling equipment, also consume energy. With so much power-hungry equipment concentrated in a small area, the pressure on the grid is easy to imagine.
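The arithmetic behind that 70 MW figure is straightforward (a rough sketch: the 700 W figure is NVIDIA's published peak per chip, and treating every chip as drawing peak power simultaneously is a simplifying worst-case assumption):

```python
# Back-of-envelope peak power for a 100,000-GPU H100 cluster.
H100_PEAK_W = 700        # peak power per H100 chip, per NVIDIA's specs
NUM_CHIPS = 100_000

peak_w = H100_PEAK_W * NUM_CHIPS   # total peak draw in watts
peak_mw = peak_w / 1e6             # convert watts to megawatts

print(f"GPU peak draw: {peak_mw:.0f} MW")  # 70 MW, before servers and cooling
```

Note this counts only the GPUs themselves; the servers, networking, and cooling described above would push the real facility load higher still.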

AI consumes power, the tip of the iceberg

The New Yorker's report on AI energy consumption attracted widespread attention. It estimated that ChatGPT may consume more than 500,000 kilowatt-hours of electricity per day. (See: ChatGPT consumes more than 500,000 kWh of electricity per day; is energy what's holding back AI's development?)

In fact, AI's current power consumption, while seemingly astronomical, is still far lower than that of cryptocurrencies and traditional data centers. The difficulties Microsoft's engineers ran into also show that what restricts AI's development is not only the energy consumption of the technology itself, but also the energy consumption of supporting infrastructure and the carrying capacity of the power grid.

According to a report released by the International Energy Agency (IEA), the combined global electricity consumption of data centers, artificial intelligence, and cryptocurrencies reached 460 TWh in 2022, nearly 2% of global electricity consumption. The IEA predicts that in the worst-case scenario, electricity consumption in these areas will reach 1,000 TWh by 2026, comparable to the electricity consumption of the whole of Japan.
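These IEA figures can be sanity-checked with simple arithmetic (the 460 TWh and "nearly 2%" figures are from the report; the implied global total is a back-of-envelope inference, not an IEA number):

```python
# Sanity-checking the IEA figures cited above.
sector_twh_2022 = 460    # data centers + AI + crypto, 2022 (IEA)
global_share = 0.02      # "nearly 2%" of global electricity consumption

# Implied total global electricity consumption, roughly 23,000 TWh
implied_global_twh = sector_twh_2022 / global_share

# Worst-case 2026 projection of 1,000 TWh: more than double 2022
growth_factor = 1000 / sector_twh_2022

print(f"Implied global consumption: ~{implied_global_twh:,.0f} TWh")
print(f"2026 worst case vs 2022: {growth_factor:.1f}x")
```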

However, the report also shows that the energy consumed directly by AI R&D is currently much lower than that of data centers and cryptocurrencies. NVIDIA holds roughly 95% of the AI server market and supplied about 100,000 chips in 2023, which consume about 7.3 TWh of electricity per year. By comparison, cryptocurrencies consumed 110 TWh in 2022, comparable to the electricity consumption of the entire Netherlands.

Caption: Estimated electricity consumption of traditional data centers, cryptocurrencies, and AI data centers in 2022 and projected for 2026 (bars stacked from bottom to top). AI currently consumes far less power than traditional data centers and cryptocurrencies. Image source: IEA

Cooling energy consumption should not be overlooked

A data center's energy efficiency is typically assessed with power usage effectiveness (PUE): the ratio of the total energy the facility consumes to the energy consumed by the IT load alone. The closer the PUE is to 1, the less energy the data center wastes. According to a report by the Uptime Institute, a data center standards organization, the average PUE of the world's large data centers in 2020 was about 1.59. In other words, for every 1 kilowatt-hour consumed by IT equipment, the supporting equipment consumed another 0.59 kilowatt-hours.
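The PUE arithmetic works as follows (a minimal sketch; 1.59 is the Uptime Institute average cited above, and the helper function is illustrative, not part of any standard tool):

```python
def pue(total_kwh: float, it_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy."""
    return total_kwh / it_kwh

# A facility whose IT load draws 100 kWh while the whole site draws 159 kWh
# sits exactly at the 2020 global average.
avg = pue(total_kwh=159.0, it_kwh=100.0)   # 1.59

# Overhead per 1 kWh of IT load: everything above the ideal PUE of 1.0
overhead = avg - 1.0
print(f"{overhead:.2f} kWh of overhead per 1 kWh of IT load")  # 0.59 kWh
```

A PUE of exactly 1.0 would mean every joule goes to computation, which is why the cooling improvements discussed below push the ratio toward 1.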

The vast majority of this additional energy goes to cooling. One study found that cooling systems can account for up to 40% of a data center's total energy consumption. In recent years, as chips have been upgraded and the power of individual devices has risen, the power density of data centers (power consumption per unit area) has kept increasing, placing higher demands on heat dissipation. At the same time, improved data center design can significantly reduce energy waste.

Because of differences in cooling systems, structural design, and other factors, PUE varies greatly between data centers. According to the Uptime Institute, European countries have brought the average down to 1.46, while more than one in ten data centers in the Asia-Pacific region still have a PUE above 2.19.

Countries around the world are taking steps to push data centers toward energy conservation and emission reduction targets. The European Union requires large data centers to install waste-heat recovery equipment; the U.S. government is funding R&D on more energy-efficient semiconductors; and the Chinese government has introduced measures requiring data centers to reach a PUE of no more than 1.3 from 2025, and to raise the share of renewable energy year by year, reaching 100% by 2032.

Caption: PUE of large data centers around the world in 2020. From left to right: Africa, Asia-Pacific, Europe, Latin America, the Middle East, Russia and the Commonwealth of Independent States, and the United States and Canada. Image source: Uptime Institute

For tech companies' electricity use, cutting consumption is hard, and expanding supply is even harder

With the growth of cryptocurrencies and AI, the data centers of major tech companies keep expanding. According to the IEA, the United States had 2,700 data centers in 2022, consuming 4% of the country's electricity, a share the agency predicts will reach 6% by 2026. As land on the U.S. east and west coasts grows scarce, data centers are gradually shifting to central regions such as Iowa and Ohio, but industry in these second-tier areas is less developed, and the power supply may not be able to meet demand.

Some tech companies have tried to bypass the grid and buy electricity directly from small nuclear power plants, but both this approach and building new nuclear plants face complex administrative approval processes. Microsoft is trying to use AI to assist with the applications, while Google is using AI to schedule computing tasks, improving grid efficiency and reducing corporate carbon emissions. As for when controlled nuclear fusion will be put into use, that remains unknown.

The warming climate is making things worse

The development of AI requires a stable, robust power grid, but as extreme weather becomes more frequent, grids in many regions are growing more fragile. A warming climate leads to more frequent extreme weather events, which not only drive surges in electricity demand and increase the burden on the grid, but also directly damage grid infrastructure. According to the IEA report, due to drought, insufficient rainfall, and early snowmelt, the global hydropower capacity factor fell below 40% in 2023, its lowest level in three decades.

Natural gas is often seen as a bridge in the transition to renewable energy, but it is unreliable during extreme winter weather. In 2021, a cold wave hit Texas and caused large-scale blackouts, leaving some homes without power for more than 70 hours. A main cause of the disaster was frozen natural gas pipelines, which shut down gas-fired power plants. The North American Electric Reliability Corporation (NERC) predicts that between 2024 and 2028, more than 300 million people in the U.S. and Canada will face a growing risk of power outages.

To ensure energy security while pursuing energy conservation and emission reduction, many countries also regard nuclear power as a transitional measure. At the 28th UN Climate Change Conference (COP28) in December 2023, 22 countries signed a joint statement pledging to triple nuclear power capacity from 2020 levels by 2050. Meanwhile, the IEA predicts that global nuclear power generation will reach a record high in 2025 as countries such as China and India aggressively build out nuclear power.

Aerial photo of the "Hualong One" reactor.

"In the face of changing climate patterns, it will become increasingly important to diversify energy sources, improve grid dispatch across regions, and adopt more shock-resistant power generation," the IEA report states. Ensuring power grid infrastructure is not only related to the development of AI technology, but also to the national economy and people's livelihood.

bibliography

[1] Kyle Corbitt. X. https://twitter.com/corbtt/status/1772392525174620355. <2024-03-26/2024-04-09>.

[2] IEA (2024), Electricity 2024, IEA, Paris https://www.iea.org/reports/electricity-2024, Licence: CC BY 4.0

[3] Andy Lawrence. Which regions have the most energy efficient data centers?. Uptime Institute. https://www.datacenterdynamics.com/en/opinions/which-regions-have-most-energy-efficient-data-centers/. <2020-08-04/2024-04-10>.

[4] Zhang, Xiaojing, Theresa Lindberg, Naixue Xiong, Valeriy Vyatkin, and Arash Mousavi. "Cooling energy consumption investigation of data center it room with vertical placed server." Energy procedia 105 (2017): 2047-2052.

[5] Evan Halper. Amid explosive demand, America is running out of power. Washington Post. https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/. <2024-03-07/2024-04-09>.

[6] Jeremy Hsu. US grid vulnerable to power outages due to its reliance on gas. New Scientist. https://www.newscientist.com/article/2411905-us-grid-vulnerable-to-power-outages-due-to-its-reliance-on-gas/. <2024-01-11/2024-04-09>.

[7] Jeremy Hsu. Much of North America may face electricity shortages starting in 2024. New Scientist. https://www.newscientist.com/article/2409679-much-of-north-america-may-face-electricity-shortages-starting-in-2024. <2023-12-23/2024-04-09>.

Planning and production

Author丨Maya Blue, science writer

Review丨Yu Yang, head of Tencent Xuanwu Lab

Planning丨Xu Lai, Ding Kun

Editor丨Ding Kun

Reviewer丨Xu Lai, Linlin

The cover image and the image in this article are from the copyright gallery
