ChatGPT consumes more than 500,000 kWh of electricity a day!

Author: 52 Hz Laboratory

Who would have thought: the ultimate bottleneck of AI is not computing power, but electricity!

You may not realize just how much electricity ChatGPT consumes.

According to The New Yorker, ChatGPT could consume more than 500,000 kilowatt hours of electricity per day to respond to about 200 million requests from users.

By comparison, the average U.S. household uses about 29 kilowatt-hours of electricity per day.

In other words, ChatGPT uses more than 17,000 times as much electricity per day as an average household.
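
A quick back-of-the-envelope check of that ratio, using only the two figures quoted above (a minimal sketch in Python):

```python
# Sanity check using only the two figures quoted above.
chatgpt_kwh_per_day = 500_000     # New Yorker estimate for ChatGPT
household_kwh_per_day = 29        # average U.S. household

print(round(chatgpt_kwh_per_day / household_kwh_per_day))   # ≈ 17,241
```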

Available data suggest that training GPT-3 alone consumed about 1.287 million kilowatt-hours of electricity, enough to power roughly 300 households for a year.
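
For reference, the "300 households for a year" comparison implies a particular annual consumption per household; a quick check (the interpretation that these are lower-consumption households is mine):

```python
# The "300 households for a year" comparison implies this per-household figure:
gpt3_training_kwh = 1_287_000
print(gpt3_training_kwh / 300)   # 4,290 kWh per household per year
# That is closer to a typical Chinese household than to the ~10,600 kWh/year
# implied by the U.S. figure of 29 kWh/day used earlier.
```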

And that is only the training stage. The more advanced GPT-4 would consume even more, conservatively estimated at tens of millions of kilowatt-hours.

It's no wonder that many industry leaders are worried about this.

Tesla CEO Elon Musk has warned: "More than a year ago the shortage was chips; next year you will see an electricity shortage, and there will not be enough power to run all the chips."

Sam Altman, the founder of OpenAI, has made a similar point: "The future development of AI depends on energy; more photovoltaic and storage capacity is needed to support AI computing power."

Altman has reportedly even put his own money into the power supply side, investing $375 million in Helion Energy as a bet on controlled nuclear fusion.

ChatGPT is a power hog

Some readers may wonder: why is ChatGPT so power-hungry?

As a large-scale deep learning language model, ChatGPT requires a lot of computing resources to make inferences and generate text.

When ChatGPT is in use, the actual resource consumption is concentrated on the server side, mainly on CPUs and GPUs.

In addition, it can take weeks or months to train a large language model, during which thousands of GPUs run continuously and consume a lot of power.

SemiAnalysis data show that OpenAI needs 3,617 NVIDIA HGX A100 servers, with a total of 28,936 GPUs, to support ChatGPT inference.

By that estimate, ChatGPT responds to about 195 million requests per day and consumes roughly 564 megawatt-hours of electricity daily, or about 2.9 watt-hours per request.
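
Those figures are mutually consistent; here is a quick check (the assumption that each HGX A100 system carries 8 GPUs is mine, but it matches the quoted total):

```python
servers = 3_617                    # NVIDIA HGX A100 systems (SemiAnalysis)
gpus_per_server = 8                # assumed: 8 GPUs per HGX A100 system
requests_per_day = 195_000_000
daily_energy_mwh = 564

print(servers * gpus_per_server)                                   # 28,936 GPUs
print(round(daily_energy_mwh * 1_000_000 / requests_per_day, 1))   # ~2.9 Wh per request
```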

Put vividly: every time you casually type a few words into ChatGPT, Sam Altman's electricity bill grows a little longer.

According to the Uptime Institute, the share of AI business in global data center electricity consumption will soar from 2% to 10% by 2025.

And with the widespread adoption of generative AI, the AI industry as a whole is expected to consume 85 to 134 TWh of electricity per year by 2027 (1 TWh = 1 billion kWh).

It is no exaggeration to say that this is roughly comparable to the annual electricity consumption of countries such as the Netherlands, Sweden, or Argentina.
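
To put 85 to 134 TWh into household terms, using the 29 kWh/day figure quoted earlier (a rough sketch, not an official comparison):

```python
household_kwh_per_year = 29 * 365          # ≈ 10,585 kWh per U.S. household

for twh in (85, 134):
    households_millions = twh * 1e9 / household_kwh_per_year / 1e6
    print(f"{twh} TWh/year ≈ {households_millions:.1f} million U.S. households")
# -> roughly 8.0 and 12.7 million households
```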

In addition to consuming electricity, ChatGPT also consumes water.

This is easy to understand: the heavy compute demand of AI models has to be matched by cooling capacity, otherwise the servers would overheat.

Research by the University of California, Riverside shows that ChatGPT consumes about 500 milliliters of water for every 25 to 50 questions it answers.

And with more than 100 million active users, the water consumption behind ChatGPT is staggering.
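
Scaling the UC Riverside figure by the roughly 195 million daily requests cited earlier gives a rough daily total (treating each request as one question is my own simplification):

```python
requests_per_day = 195_000_000
ml_per_question_low, ml_per_question_high = 500 / 50, 500 / 25   # 10-20 mL per question

low_litres = requests_per_day * ml_per_question_low / 1000
high_litres = requests_per_day * ml_per_question_high / 1000
print(f"≈ {low_litres/1e6:.1f} to {high_litres/1e6:.1f} million litres of water per day")
# -> roughly 2 to 4 million litres per day
```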

Researchers estimate that by 2027, the global demand for AI could require the consumption of 6.6 billion cubic meters of water, almost the equivalent of the annual water withdrawal of the US state of Washington.

Electricity and water are the biggest costs, so the real ceiling on computing power turns out to be the pace of new-energy development. And that is indeed the case.

U.S.-China energy competition

When it comes to energy, it is no exaggeration to say that China now ranks first in the world in every category of power generation except nuclear.

In 2023, China's total electricity generation reached a staggering 8,909.09 billion kWh (about 8,909 TWh), up 5.17% year on year.

Among them, thermal power generation is still the absolute main force, accounting for 69.95% of the country's power generation.

Non-fossil sources are also rising fast: hydropower accounted for 12.8%, wind 9.08%, solar 3.3%, and nuclear 4.86%.
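
Multiplying those shares by the 2023 total gives the absolute generation per source (derived only from the figures quoted above):

```python
total_twh = 8_909.09    # China's 2023 generation, in TWh (8,909.09 billion kWh)
shares = {"thermal": 69.95, "hydro": 12.8, "wind": 9.08, "nuclear": 4.86, "solar": 3.3}

for source, pct in shares.items():
    print(f"{source:>7}: ≈ {total_twh * pct / 100:,.0f} TWh")
# thermal ≈ 6,232, hydro ≈ 1,140, wind ≈ 809, nuclear ≈ 433, solar ≈ 294 TWh
```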

Compare that with earlier data and the pace of change is obvious.

In 2011, thermal power accounted for 82.4% of China's total generation, hydropower 14.1%, nuclear 1.8%, wind 1.6%, and solar just 0.1%.

It can be seen that the proportion of thermal power is declining very obviously, while the proportion of clean energy is rising.

In terms of installed capacity, China's total generating capacity reached about 2,919.65 GW (2.92 billion kilowatts) by the end of 2023, up 13.9% year on year.

Among them, thermal power stood at about 1,390.32 GW, hydropower at 421.54 GW, nuclear at 56.91 GW, grid-connected wind at 441.34 GW, and grid-connected solar at 609.49 GW.
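
The component figures are consistent with the 2,919.65 GW total; their shares, computed directly from the numbers above:

```python
capacity_gw = {"thermal": 1390.32, "hydro": 421.54, "nuclear": 56.91,
               "wind": 441.34, "solar": 609.49}
total_gw = 2919.65

for source, gw in capacity_gw.items():
    print(f"{source:>7}: {gw:8.2f} GW ({gw / total_gw:.1%})")
print(f"sum of components: {sum(capacity_gw.values()):.2f} GW")
# thermal ≈ 47.6%, hydro ≈ 14.4%, nuclear ≈ 1.9%, wind ≈ 15.1%, solar ≈ 20.9%
```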

The United States, for its part, generated 4,178.171 billion kWh (about 4,178 TWh) in 2023, down 4.2% year on year.

As for installed capacity, the net summer generating capacity of U.S. power plants stood at about 1,189.4 GW in 2023, up 2.41% year on year.

On both generation and installed capacity, the United States sits at less than half of China's level.
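
On the figures quoted in this section, the gap looks like this:

```python
china_generation_twh, us_generation_twh = 8_909.09, 4_178.171
china_capacity_gw, us_capacity_gw = 2_919.65, 1_189.40

print(f"generation: US / China ≈ {us_generation_twh / china_generation_twh:.0%}")  # ≈ 47%
print(f"capacity:   US / China ≈ {us_capacity_gw / china_capacity_gw:.0%}")        # ≈ 41%
```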

As carbon-emission rules become more stringent, the share of new energy will undoubtedly keep rising, so the future of AI is likely to depend on it.

In addition, AI will also improve the efficiency of power systems in a variety of ways, thereby reducing unnecessary energy consumption and carbon emissions.

For example, AI can analyze patterns in power demand and predict peak usage hours, thereby helping grid operators optimize power supply through demand-side management and reduce the burden on the grid during peak periods.
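
As a toy illustration only (not any grid operator's actual method), here is a minimal sketch of flagging likely peak hours from historical hourly demand:

```python
import statistics

def likely_peak_hours(hourly_demand_mw, top_n=3):
    """hourly_demand_mw: one reading per hour, covering whole days."""
    by_hour = {h: [] for h in range(24)}
    for i, mw in enumerate(hourly_demand_mw):
        by_hour[i % 24].append(mw)
    averages = {h: statistics.mean(v) for h, v in by_hour.items() if v}
    # hours with the highest average demand are the likely peaks
    return sorted(averages, key=averages.get, reverse=True)[:top_n]

# two days of synthetic data with an evening peak around hours 18-20
demand = [300 + (200 if 18 <= h % 24 <= 20 else 0) + 50 * (h % 5) for h in range(48)]
print(likely_peak_hours(demand))   # -> [19, 18, 20]
```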

In addition, AI can effectively integrate renewable energy sources such as solar and wind power, optimizing the use of these resources by predicting energy production and demand, and reducing dependence on fossil fuels.
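
And a similarly minimal sketch of the dispatch idea: given hourly forecasts of demand and renewable output, use renewables first and fill the rest with fossil generation (purely illustrative; real unit-commitment models are far more complex):

```python
def dispatch(forecast_demand_mw, forecast_renewables_mw):
    plan = []
    for demand, renewable in zip(forecast_demand_mw, forecast_renewables_mw):
        used = min(demand, renewable)          # renewables are dispatched first
        plan.append({"renewable_mw": used, "fossil_mw": demand - used})
    return plan

print(dispatch([500, 650, 700], [300, 680, 400]))
# -> fossil shortfall of 200, 0 and 300 MW across the three hours
```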

That's why countries around the world are sparing no effort to develop AI.

By the way, have you heard of China's "Eastern Data, Western Computing" project?

It is reported that the first 400G all-optical inter-provincial backbone network of the project has been officially put into commercial use, which will greatly improve the data transmission efficiency between the eight hubs.

In terms of speed, the transmission bandwidth of the "Eastern Data and Western Computing" network is four times that of the previous generation of trunk networks: data that used to take 10 minutes to transmit now takes a little over 2 minutes on 400G links (10 ÷ 4 = 2.5 minutes).

The so-called "Eastern Data and Western Computing" refers to the construction of a new computing network system that integrates data centers, cloud computing, and big data, guiding the demand for computing power from the east to the west in an orderly manner, optimizing the construction layout of data centers, and promoting the coordination and linkage between the east and the west.

To put it bluntly, it is to concentrate on using cheap electricity in the west to serve the computing power applications in the east.

As we all know, the western region is vast and sparsely populated, especially Ningxia, Qinghai, Inner Mongolia, Xinjiang and other places, which have abundant solar energy resources, providing unique conditions for solar power generation.

Coupled with China's ultra-high-voltage (UHV) transmission technology, electricity generated in the west can be delivered quickly to the east.

It is not only fast but also has low transmission losses, and above all it is cheap.

Finally, China's energy-storage technology keeps improving, chipping away at the biggest bottleneck for AI development.

Final thoughts

So don't be fooled by how hot ChatGPT's development in the United States looks right now; the further it goes, the sooner it will hit the energy ceiling.

And our AI infrastructure is well prepared.

So, the second half of AI has just begun, who will reach the end in the end, what do you think?
