
Grabbing electricity, grabbing land, placing bets: a deep dive into the tech giants' $100 billion AI energy war


Silicon Valley 101

2024-05-15 11:23 · Posted in Hong Kong, China

Rumor has it that when OpenAI was training GPT-6, it overloaded Microsoft's power grid. Have you ever wondered how much power AI will consume as the generative AI arms race continues?

Training GPT-3 consumed about 1,300 megawatt-hours of electricity. Used instead to stream online video, that much energy could power 1,625,000 hours of playback, or 185.5 years.
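A quick back-of-the-envelope check of this comparison, taking the figure as roughly 1,300 megawatt-hours. The ~0.8 kWh per streaming hour is our assumption, chosen because it makes the article's numbers line up; it is not from the source:

```python
# Rough check of the GPT-3 training-vs-streaming comparison.
# ASSUMPTION: ~0.8 kWh per hour of online video streaming (not from the source).
GPT3_TRAINING_MWH = 1_300            # ~1,300 MWh cited for GPT-3 training
KWH_PER_STREAM_HOUR = 0.8            # assumed streaming energy per hour

total_kwh = GPT3_TRAINING_MWH * 1_000
stream_hours = total_kwh / KWH_PER_STREAM_HOUR
stream_years = stream_hours / (24 * 365)
print(f"{stream_hours:,.0f} hours = {stream_years:.1f} years")
# → 1,625,000 hours = 185.5 years
```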

To put it another way, researchers found that when a large model is used for AI text-to-image generation, each image generated consumes, on average, enough electricity to fully charge a smartphone.

As a more macro example: in 2024, as we make this video, AI data centers account for about 2.5% of total electricity consumption in the United States.

But that's just the beginning. Silicon Valley's tech giants, Nvidia, Google, Microsoft, Amazon, Meta, Tesla, Oracle and others, have begun building large-scale data centers, and OpenAI has teamed up directly with Microsoft on "Stargate", a $100 billion data center.


With clusters of tens of thousands of GPUs becoming the standard for training generative AI, Silicon Valley has begun racing to build multimodal large models, and the scaling law remains the industry's panacea. It is easy to see that power consumption will rise exponentially.

Welcome to Silicon Valley 101. In this issue, we talk about the energy challenges that AI development brings. First, let's answer the question: why does training a large model consume so much power?

01 Why is generative AI so power-hungry?

In 1961, Rolf Landauer, a physicist working for IBM, proposed Landauer's Principle.


It states that when information stored in a computer changes irreversibly, the entropy of the system increases, accompanied by the dissipation of energy. Put simply: processing information has an energy cost.

1.1 AI training and inference: the energy cost of processing information

Ever since generative AI settled on the Transformer architecture and the "scaling law" of ever-larger parameter counts, large AI models have been bound to massive computation. Both training and inference involve enormous amounts of computation and information processing, in other words, huge energy costs.

In the training stage, a large AI model must collect and preprocess vast amounts of text data, then initialize parameters, process the data, generate outputs, and adjust and optimize. With each model iteration, the number of parameters to process grows exponentially: GPT-3 has 175 billion parameters, GPT-4 has 1.8 trillion, GPT-5 may exceed 10 trillion, and GPT-6, rumored to be in training, may reach the hundred-trillion or even quadrillion scale.


Kyle Corbitt, a former director at the top Silicon Valley incubator Y Combinator, revealed on Twitter that a Microsoft engineer told him GPT-6 training had overloaded Microsoft's power grid, making it impossible to deploy more than 100,000 H100 GPUs in a single state.


We'll explain later why the grid fails; the point here is just how staggering the power consumption of training GPT-6 is. And once training is complete, inference also demands very large amounts of compute and power.

Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

My understanding is that we are still in the stage of training large AI models. Once these models are trained, their downstream applications, inference above all, will be where the largest energy consumption occurs, and that power draw may be far larger than what it took to train the model over the preceding months.


As we know, the Transformer is an autoregressive model, meaning inference involves multiple rounds of repeated computation; in the generation stage, every token produced must interact with GPU memory.

As we said at the start, generating one AI image consumes, on average, enough power to fully charge a phone. ChatGPT handles about 200 million requests a day and consumes more than 500,000 kilowatt-hours of electricity, equivalent to the daily consumption of about 17,000 average American households.
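These reported figures are easy to cross-check. The ~29.4 kWh/day average U.S. household figure is our assumption for the arithmetic, not from the source:

```python
# Cross-check ChatGPT's reported daily energy figures.
DAILY_KWH = 500_000                  # reported daily consumption (kWh)
DAILY_REQUESTS = 200_000_000         # reported daily requests
HOUSEHOLD_KWH_PER_DAY = 29.4         # ASSUMED average US household usage

wh_per_request = DAILY_KWH * 1_000 / DAILY_REQUESTS
households = DAILY_KWH / HOUSEHOLD_KWH_PER_DAY
print(f"{wh_per_request:.1f} Wh per request, about {households:,.0f} households/day")
# → 2.5 Wh per request, about 17,007 households/day
```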

Therefore, whether in training or inference, the more parameters a model has, the more data must be processed, the more computation is required, the more energy is consumed, and the more heat is released. That in turn demands ever more powerful chips, and the quest for them is endless.

John Yue

Inference.ai Founder and CEO

Personally, I think the demand for chips is endless. Say I train something in six months; my competitor will say, fine, I'll buy a few more GPUs and do it in three. Then I go to two months, they go to one, and there's no end to it, because everyone always wants to be faster.


Faster, bigger, stronger.

This places ever higher demands on AI chips. To support such enormous computation, the tech giants have built their own data centers, interconnecting tens of thousands of GPUs to supply AI computing power.

If the energy consumed by AI training and inference is the tip of the iceberg, the power consumption of the data center itself is the huge mass of ice below the waterline.

Taking it a step further, even greater energy consumption comes from the current flowing through the chips and from the supporting facilities of the entire data center.

1.2 Data centers with tens of thousands of GPUs: the power-devouring demons of Joule's law and cooling systems

We all know that AI computing power relies on the parallel computing of GPU chips. Each chip now holds billions of transistors; Nvidia's recently announced Blackwell-architecture GPU, for example, has 208 billion. When these transistors operate, current flows through them. Recall Joule's law from physics: the heat generated by a current through a conductor is proportional to the square of the current, to the resistance of the conductor, and to the time the current flows (formula: Q = I²Rt).
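As a minimal illustration of Q = I²Rt (the current, resistance, and time values below are made up for demonstration), note the quadratic dependence on current:

```python
# Joule's law: heat Q (joules) = I^2 * R * t.
def joule_heat(current_a: float, resistance_ohm: float, seconds: float) -> float:
    """Heat dissipated by a current through a resistance, in joules."""
    return current_a ** 2 * resistance_ohm * seconds

# Doubling the current quadruples the heat for the same R and t:
print(joule_heat(1, 5, 10))  # → 50
print(joule_heat(2, 5, 10))  # → 200
```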


One can therefore imagine the power consumed and heat generated when trillion-parameter AI models train and run inference across hundreds of billions of transistors on tens of thousands of GPU chips.

Beyond the chips themselves, a data center's cooling system also consumes a great deal of energy. The standard metric here is Power Usage Effectiveness (PUE): total energy consumed divided by the energy consumed by IT equipment. The closer PUE is to 1, the less energy the data center wastes.


According to a report by the Uptime Institute, a data center standards organization, the average PUE of the world's large data centers in 2020 was about 1.59. In other words, for every 1 kilowatt-hour of electricity consumed by IT equipment in a data center, 0.59 kilowatt-hours of electricity will be consumed by supporting equipment. Most of this energy consumption is used in cooling systems, which in many data centers can account for up to 40% of total energy consumption.

So in recent years, with the take-off of generative AI, the tech giants have been rapidly grabbing land to build new AI data centers. They don't care about the price of electricity; "where is there electricity" has become the question they care about.

John Yue

Inference.ai Founder and CEO

When we designed data centers before, we didn't actually consider that they would need so much electricity; it was all about bandwidth, so you'd build closer to the ISPs (network service providers) to guarantee a bandwidth advantage. Now we find we actually need to be closer to electricity, not bandwidth. If you want to build this kind of accelerated-computing data center, say 32,000 GPUs, the bandwidth requirement is far smaller than the electricity requirement.

Chen Qian: So they're built where electricity is cheaper?

John Yue

Inference.ai Founder and CEO

No, it's no longer a question of whether electricity is cheap; there simply isn't enough electricity. At the data-center layer, what everyone is doing now is shopping for power: the moment a large power station opens, someone will rush to build a data center on that land.

According to a recent research report Bank of America distributed to institutional clients, worldwide data-center energy consumption will soar at a compound annual growth rate of 25% to 33% between 2023 and 2028.
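Compounded over the five years from 2023 to 2028, those growth rates imply a large multiple. A quick sketch of the arithmetic:

```python
# Compound annual growth: x_n = x_0 * (1 + r)^n over the 2023-2028 window.
def compound(base: float, rate: float, years: int) -> float:
    return base * (1 + rate) ** years

for rate in (0.25, 0.33):
    multiple = compound(1.0, rate, 5)   # normalized so 2023 = 1.0
    print(f"{rate:.0%} CAGR → x{multiple:.2f} by 2028")
# → 25% CAGR → x3.05 by 2028
# → 33% CAGR → x4.16 by 2028
```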


Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

AI is actually very important to a country's economy. A very rough estimate: each megawatt of AI data-center load can bring in about $10 million in annual revenue, while a megawatt-hour of electricity may cost only about $30 to $50. That's an extremely high economic return, which is why, no matter how high the price of electricity, every tech company is willing to build data centers as long as there is electricity.
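Ethan's rough economics can be spelled out: running 1 MW continuously for a year is 8,760 MWh, so even at the high end of his price range, electricity is a small fraction of the revenue he cites:

```python
# Revenue vs. electricity cost for 1 MW of AI data-center load (figures as cited).
HOURS_PER_YEAR = 8_760
REVENUE_PER_MW_YEAR = 10_000_000     # ~$10M annual revenue per MW

for usd_per_mwh in (30, 50):
    cost = HOURS_PER_YEAR * usd_per_mwh          # 1 MW running all year
    share = cost / REVENUE_PER_MW_YEAR
    print(f"${usd_per_mwh}/MWh → ${cost:,}/yr ({share:.1%} of revenue)")
# → $30/MWh → $262,800/yr (2.6% of revenue)
# → $50/MWh → $438,000/yr (4.4% of revenue)
```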

With such a potentially lucrative, high-return business, how could the giants not place their bets? According to a report by the International Energy Agency (IEA), the global electricity consumption of data centers, artificial intelligence, and cryptocurrencies reached 460 TWh in 2022, nearly 2% of global energy consumption. The IEA predicts that in the worst case, electricity consumption in these areas will reach 1,000 TWh by 2026, comparable to the electricity consumption of all of Japan.


The problem now is that electricity demand is growing rapidly, but grid infrastructure in many regions, including the United States, has gone unrenovated for years and cannot keep pace with AI's growth. Next, let's talk about how AI power consumption could cause a global power shortage.

02 Where is the power shortage: old infrastructure and soaring new demand?

In Kyle Corbitt's tweet, Microsoft engineers revealed that Microsoft once deployed more than 100,000 H100 GPUs in one state to train GPT-6, overloading and crashing Microsoft's power grid. Why does this happen?

Xu Yixing (Ethan):

Senior Project Manager, Energy Strategy, Microsoft

A power grid is basically designed around your load. A traditional data center draws power stably, roughly constant 24 hours a day. AI training and inference, however, show very different consumption patterns: during training or in applications there are very large swings, say from 100% of load down to 10%, then back up to 100% the next second. Swings that large, within a few seconds or even within one second, put unacceptable shocks on the grid and affect its stability.

In fact, data centers have always consumed a lot of power, but with the AI boom, the major giants have launched an arms race and are building new data centers at scale. The data-center load is now so heavy that the generation system cannot supply that much power, and the underlying problem is that the grid infrastructure has not been updated in nearly 20 years.

Over the past 20 years, although the U.S. economy has kept growing, "deindustrialization" has decoupled economic growth from electricity consumption: electricity use has grown only about 0.5% per year, very different from the situation in some developing countries in Asia. U.S. engineers have not faced demand growth of this scale in two decades, so grid planning never anticipated it, and since construction capacity is relatively weak, supply cannot catch up in the short term. As a result, many parts of the United States may face power shortages over the next three to five years.

Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

The challenge is just as enormous for policymakers. In the United States, building out the grid means building power stations and transmission lines, which can involve millions of residents. Much American land is privately owned, so upgrading the grid means transmission lines crossing many private landowners' property. Convincing them to allow grid and transmission construction will be a very big challenge.

A recent article in The New Yorker described AI's energy demand as "obscene", and that is putting it politely. But the giants are not stopping over grid challenges; Microsoft and OpenAI have gone so far as to commit $100 billion to build the largest AI supercomputer project ever: Stargate.


03 $100 billion and gigawatts: Stargate

Let's take stock of the Silicon Valley giants. Meta currently has 650,000 H100s and plans to spend $800 million this year on an AI data center; Amazon plans to invest $650 million in data centers; Google is even more generous, investing $1 billion. But all of these are small change next to Microsoft.

According to the American tech outlet The Information, OpenAI and Microsoft plan to spend $100 billion on an AI supercomputer called "Stargate", 100 times more than other data centers currently in operation. Bear in mind that Microsoft's investment in OpenAI is only $13 billion; $100 billion would be enough to fund nearly eight OpenAIs.


The Stargate project is expected to be completed in 2028. The chips purchased will no longer be H100s but millions of Nvidia's latest B200s, and most importantly, the project's power demand will reach several gigawatts.

Although Stargate is still in early planning, has not been formally approved, and may yet change, the plan sends a clear signal: whoever masters computing power controls the future.

Such a huge demand for electricity will open an immeasurable gap in the U.S. power system. You may ask: why doesn't Microsoft build Stargate in another country to ease the burden on the U.S. grid?

Xiang Jiang

CEO of Hanhai Juneng

Data is now an asset, even a strategic asset. As for the next stage of AI development: first there was a shortage of chips, then a shortage of data, and now a shortage of energy is coming to the fore, while the data shortage persists. Building a data center in another country and then training on, or even using, that country's data is unthinkable.

With that said, the U.S. power shortage will keep widening. So how do you solve the energy-consumption problem and keep a project of Stargate's magnitude supplied?

Judging from Stargate's internal discussions revealed by The Information, more efficient data-center optimization and alternative energy sources such as nuclear are the directions most urgently in need of breakthroughs. Let's start with chip and data-center optimization.

04 Data Center Optimization: Chip Efficiency and Liquid Cooling Technology

We talked earlier about a data center's power usage effectiveness, PUE: the closer PUE is to 1, the higher the energy efficiency. So how do you optimize a data center's PUE? Jensen Huang has offered some feasible answers.

At Nvidia's GTC 2024, Huang said the next-generation Blackwell GB200 consumes one quarter the energy of its Hopper-architecture predecessor. How does Blackwell optimize energy consumption? Let's walk through Nvidia's animated demonstration.

This is Blackwell's GPU die; in the animation, two dies are joined together to form the core of the B100.


The six blocks next to the core are HBM (High Bandwidth Memory); beside them are eight memory cards. Together, this makes one GPU.


Then their Grace CPU is added, the main brain driving the two GPUs. This is called the GB200; the G stands for the Grace CPU.


Two GB200s installed in the rack form a NODE, a computing node. The card added here is InfiniBand, whose main function is high-speed communication between the various computing units.


Besides InfiniBand, the NODE also adds a DPU (Data Processing Unit) to handle data and offload work from the CPU.


Nvidia then combines 18 NODEs. This is Nvidia's NVLink Switch chip, which you can think of as a network switch used to connect the NODEs.


Add the LAN acceleration card, and you have a whole unit. Multiply the number of units, and you eventually get a data center.


Having walked through this GTC animation, you now know what Blackwell and the B200 are. At GTC, Huang said the Blackwell GPU has 208 billion transistors and, for AI training, is twice as fast as the H100 and five times faster for inference. On top of that, for the same AI training workload, the GB200 consumes a quarter of the power of the previous generation.


To some extent, then, the arrival of the B200 will ease the power-consumption problem of AI data centers; after all, Nvidia holds 95% of the AI market.

On energy consumption, one more point matters. What keeps all of these Nvidia data centers running smoothly is liquid cooling, a technology about to become the industry standard. Bank of America's research report explains that as data-center power density increases, traditional air cooling may no longer suffice, and liquid-cooling solutions will be needed to improve data-center performance.


We interviewed someone at Supermicro at the NVIDIA GTC and they said the same thing: AI data centers after NVIDIA's Blackwell architecture will move to liquid cooling.


Liquid cooling has actually been in development for quite some time and currently splits into two technical directions: cold-plate liquid cooling and immersion liquid cooling. Technical details aside, liquid cooling not only creates an opportunity to reduce data-center energy consumption but can also bring PUE down to an ideal state close to 1.

But note that I'm talking about "ideal" here. What about reality?

John Yue

Inference.ai Founder and CEO

The B100 is much trickier than the H100, because there's no market standard for liquid cooling yet, and many data centers don't dare touch it on their own. Nvidia has requirements: the hardware ships without liquid cooling, so to install it you have to dismantle what came with the unit, and if problems arise afterward, Nvidia won't honor the warranty. So a lot of people don't dare touch liquid cooling.

First, the biggest problem is production capacity. Even with Blackwell out, the H100 is still in short supply. As for swapping H100s for B100s, leaving aside whether there are even enough cards, when the whole industry is short of computing power, companies can only add capacity, not replace it.

Second, even if you wanted to swap B100s in for H100s, there are technical problems. When a data center is designed, the entire supporting stack of transformers, wiring, and cooling must be matched to the chip. Whether B100, B200, or GB200, the supporting requirements differ from the previous generation, so existing data centers are hard to convert directly.

John Yue

Inference.ai Founder and CEO

No one outside Nvidia has successfully deployed a B100 yet, so how to deploy it is still uncertain, because it really is different from the H100. In the video, you open a latch, the H100 comes out, the B100 goes in; that is very idealistic. In reality you have to change a lot of things. The B100 may be slightly easier; as for the B200 everyone is supposed to use, I think that will take a long time.

The racks may have to be rebuilt because the power draw is too high. You may have to rebuild or rearrange your data center because you don't have enough power for cooling, or because racks packed too close together run too hot and your cooling can't keep up, so you have to move them all apart, which is a huge hassle.

Finally, because of competition, capital's demand for computing power is endless. In other words, improvements in chip and data-center energy efficiency still cannot close the compute shortfall relative to overall market demand, and total energy consumption keeps soaring.


In the era of rapid PC development there was "Andy and Bill's law": no matter how much Intel improved chip performance, the gains were soon eaten up by Microsoft's software demands. In today's AI era, a similar law may play out again.

Xu Yixing (Ethan):

I think something similar may happen with GPUs: their energy consumption drops a lot, and quickly, but precisely because of that drop, more people in more applications will need more GPUs, and in the end overall energy consumption rises.

So, is there a more powerful solution to help us solve the energy problem?

05 The ultimate solution to energy: nuclear fusion?

In the face of such a large power gap, where should the giants go?

Ethan Xu suggested that in the short term, distributed energy storage will be an important part of the solution: for example, developing photovoltaics so that every household has solar panels, reducing residential dependence on the grid and freeing up more electricity for the AI industry. After all, as mentioned earlier, AI matters greatly to the national economy, so the government has an incentive to push such a plan forward.

Meanwhile, because power plants generate continuously while grid load is sometimes low, unused electricity is simply wasted, so building energy storage also lets a plant's output be fully used. Mainstream storage today is batteries; some places use pumped hydro storage, pumping water uphill when demand is low and releasing it at peak times to generate power from the flow.

However, distributed storage and renewable generation can only help in the short term; they do not look like reliable energy support for AI's long-term development and cannot meet long-term demand.

Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

One main reason is that much of the clean energy we now rely on, such as wind and solar, is not fully controllable: when there's wind and sun you have energy, and when there isn't, you don't.

To this end, giants like Microsoft are actively partnering with various U.S. power companies, and Sam Altman has even invested directly in an energy company called Helion Energy.

Lex Fridman: How do you solve the energy problem? Nuclear fusion?

Sam Altman: That's what I believe in.

Lex Fridman: Who's going to solve this problem?

Sam Altman: I think Helion is doing it best.


That's right: nuclear energy is the direction the giants firmly believe in. Nuclear fusion heats deuterium and tritium into a plasma state; once fusion has produced energy and the external input stops, the plasma dissipates and the reaction ends, making it comparatively controllable and safe.

Helion is on the fusion path and has signed a performance-bet agreement with Microsoft, promising to begin generating electricity through fusion by 2028 and, a year later, to supply Microsoft with at least 50 megawatts at $0.01 per kilowatt-hour, or pay a penalty. This aggressive bet is considered the first commercial agreement in fusion power generation.
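At the contracted terms, the deal's headline value is modest, which underlines that it is more a demand signal than a revenue play. A quick calculation from the cited numbers:

```python
# Value of Helion's promised supply at the contracted price.
MW = 50                     # promised supply to Microsoft
USD_PER_KWH = 0.01          # contracted price
HOURS_PER_YEAR = 8_760

annual_kwh = MW * 1_000 * HOURS_PER_YEAR    # 50 MW delivered all year
print(f"${annual_kwh * USD_PER_KWH:,.0f} per year")  # → $4,380,000 per year
```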


Xiang Jiang

CEO of Hanhai Juneng

Its confidence lies in the technical route it has adopted, which makes the device very cheap to build. If you build a tokamak for billions, or four or five billion, the technology's iteration cycle stretches past 10 years.


Xiang Jiang

CEO of Hanhai Juneng

It adopts a field-reversed configuration, which greatly lowers the capital threshold.


So what is the current level of nuclear fusion?

Xiang Jiang

CEO of Hanhai Juneng

At the current level of technology, nuclear fusion can generate electricity, but we are still at the experimental stage, running experiments with deuterium-deuterium fuel. To match the generation efficiency of a nuclear power plant, you need at least deuterium-tritium fusion, and tritium is practically priceless, around two to three million yuan per gram.

To generate electricity from fusion you also have to solve heat transfer, turbine generators, grid facilities and more, an investment running into the billions, so many voices in the industry are quite skeptical that Helion will start powering Microsoft with fusion in 2029. In fact, Microsoft itself is not blindly optimistic about the arrival of fusion technology.

Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

In fact, the main purpose of Microsoft investing in this company and signing this contract is to send a strong demand-side signal early on, supporting innovative companies this way, helping them, and reducing the risks they face.

Microsoft and several other big companies, including Amazon and Google, have long used their investment arms to back new technologies, including nuclear energy and nuclear fusion. Their hope is that by investing, these technology companies can develop better, reach scale faster and at lower cost, and bring fusion about as soon as possible.

While when nuclear fusion will arrive remains a huge unknown, it is clear that nuclear energy will be the giants' next target market. Recently, Amazon purchased a data center site in Pennsylvania supplied with nuclear power; according to two people involved in the talks, Microsoft has also discussed bidding for the same site. Data center sites with nuclear power supply may well become the tech giants' next battleground.

To close, let's talk about a more immediate question: what becomes of the carbon neutrality goals the Silicon Valley tech giants have promised?

06 More expensive and more difficult carbon neutrality goals

As we all know, the world is pushing toward carbon neutrality, but with AI's enormous power consumption, the difficulty and cost of getting there may double.

The current generative AI arms race has undoubtedly disrupted the carbon emission plans of many tech giants, because scaling-law-driven AI development is extremely energy-intensive; it is, arguably, a very high-carbon economic activity.

Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

The energy transition started much earlier, so many big companies, including Microsoft, Google, Amazon and Meta, made these commitments to the public before the advent of AI; they were not thinking about AI at the time.

Microsoft has pledged to run on 100% carbon-free clean energy and reach carbon neutrality by 2030, Amazon has pledged carbon neutrality by 2040, and Google and Meta have pledged carbon neutrality across their operations and value chains by 2030. This round of AI, however, has made those commitments look much harder to meet.

Xu Yixing (Ethan)

Senior Project Manager, Energy Strategy, Microsoft

When these companies made these commitments, they were already ambitious and difficult enough. Even before AI was included, the cost of the energy transition would have been high; with AI added, the cost could double. And in the last 5% or 10% of the transition, when you push from 95% toward 99% clean energy, the cost rises almost exponentially.

Beyond electricity, water faces similar challenges. In recent years, the leading companies in large AI models have also seen their water consumption surge, with data showing that the AI chatbot ChatGPT "swallows" about 500 milliliters of water for every 10 to 50 prompts it answers.
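Taking the quoted figure at face value, a small sketch of what it implies per prompt and at scale; note that the daily query volume below is a purely hypothetical assumption for illustration, not a reported number:

```python
# Back-of-envelope on the "500 ml of water per 10-50 prompts" figure quoted above.
WATER_PER_BATCH_ML = 500
PROMPTS_PER_BATCH = (10, 50)  # the quoted range

# Implied water per single prompt, in milliliters
per_prompt_ml = tuple(WATER_PER_BATCH_ML / n for n in PROMPTS_PER_BATCH)

# Hypothetical daily query volume, purely for illustration
DAILY_PROMPTS = 100_000_000

daily_liters = tuple(ml * DAILY_PROMPTS / 1000 for ml in per_prompt_ml)
print(f"per prompt: {per_prompt_ml[1]:.0f}-{per_prompt_ml[0]:.0f} ml")
print(f"at {DAILY_PROMPTS:,} prompts/day: "
      f"{daily_liters[1]:,.0f}-{daily_liters[0]:,.0f} liters/day")
```

Under that assumed volume, the range works out to one to five million liters a day, on the order of the domestic water use of a town of tens of thousands of residents.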

In June 2023, Microsoft released its environmental sustainability report for 2022, which showed water use up more than 34% year over year.

Google was in a similar position: amid Uruguay's worst drought in 74 years, protests in mid-2023 even sought to block Google's plan to build a data center near the country's capital.

And the liquid cooling technology we mentioned earlier, as it is further popularized and deployed, will keep placing continuous demands on water resources.

The choice between technological development and energy consumption is a genuine dilemma. On one side, winning the AI technology and business battle requires stable, abundant electricity; on the other, companies must honor their environmental commitments to society and fight the expensive, difficult battle for carbon neutrality. A popular saying in the industry goes: the end of AGI is energy. If controlled nuclear fusion is not achieved before humanity reaches AGI, how far can existing energy solutions take us? That is a huge uncertainty.

At the same time, as tech companies tally the cost of AI, the social costs should not be forgotten. The Intergovernmental Panel on Climate Change (IPCC) warns that if global temperature rise is not effectively controlled within this century, climate change will cross tipping points and bring more frequent extreme weather events. Research by the Climate Policy Initiative estimates that cumulative losses from climate risks could reach hundreds of billions of dollars by 2100.

If AI development drives up carbon emissions, delays carbon neutrality, and leads to greater climate losses, then those losses too must be factored into AI's cost. When they are, will humanity's economic case for developing AI technology still add up?
