
Fun fact: Artificial intelligence consumes not only electricity, but also water

AI models can and should address their own water footprint.

The next time you start a conversation with OpenAI's ChatGPT, think about the bottle of water that interaction requires. With more than 100 million people using the chatbot every month, that adds up quickly.

Training natural-language models such as ChatGPT, the popular chatbot created by Microsoft-backed OpenAI, requires water to cool the data center servers that run these programs. The same is true of Bard, developed by Alphabet's Google.

Cooling towers, which use large amounts of water, are the most efficient and most widely used cooling method in many regions, although cooling with air, refrigerant, or a combination of the two is also an option. Servers also consume a lot of electricity, and power generation is another indirect way of using water: while solar and wind power do not consume water, coal, natural gas, and nuclear power all consume large amounts of it.


Researchers from the University of California, Riverside, and the University of Texas at Arlington said in a paper awaiting peer review that the "huge water footprint" of AI models has remained "hidden under the radar," and that "this is extremely worrisome because freshwater scarcity has become one of the most pressing challenges we all share."

Western U.S. states are grappling with water shortages caused by a 20-year "megadrought," made worse by extreme heat, aging infrastructure, and chronic overuse by agriculture, manufacturing, and growing cities and populations.

Earlier this month, the Biden administration introduced a proposal that would allow the federal government to cut water withdrawals from the shrinking Colorado River in Arizona, California and Nevada.

"People need to realize that using machine learning models and training machine learning models doesn't require water," Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside and co-author of the paper, told Barron's. ” 

The authors write that a simple conversation with ChatGPT consumes 500 milliliters (about 17 fluid ounces) of water. They warn that these numbers "could increase several times more, with the new GPT-4 having a significantly larger model size."

The researchers point out that the amount of water used depends on when and where ChatGPT is run: during hotter times of the day, more water is needed to cool the data center, so water consumption is higher. Training GPT-3 in Microsoft's U.S. data centers can consume 700,000 liters, or about 184,920 gallons, of clean freshwater over 15 days.
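As a quick back-of-the-envelope check of that conversion (our arithmetic, not the researchers'), using 1 U.S. gallon ≈ 3.78541 liters:

\[
700{,}000~\text{L} \div 3.78541~\tfrac{\text{L}}{\text{gal}} \approx 184{,}920~\text{gal}
\]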

The report said the 700,000 liters of water would be enough to produce 370 BMW cars or 320 Tesla electric vehicles.

Ren says the figure is quite conservative, because the authors do not know exactly when or where GPT-3 was trained. If GPT-3 had been trained in a hotter region or during the summer, he said, the total could be several times more than 700,000 liters.

"To address global water challenges, AI models can and should be socially responsible and lead by example to address their own water footprint," the researchers said.

James Rees, chief impact officer at Botanical Water Technologies, said tech companies "are becoming more proactive, and they're doing the right thing when it comes to water management. However, [artificial intelligence] technology and its applications are rapidly evolving."

He described the technology as having a "double impact" on water consumption and the broader environment. In most cases, he said, AI tools are themselves used in technology development and can help find better ways to reduce carbon emissions or save energy and water.

Big companies, including Microsoft, Meta Platforms' Facebook, and Google, have pledged to become "water positive" by 2030, replenishing more water than they use in their direct operations.

OpenAI did not respond to a request for comment. 

A Microsoft spokesperson said in an emailed statement to Barron's that the company is addressing water use in two ways: reducing water intensity (the amount of water consumed per megawatt of energy used for operations), and replenishing water in the water-stressed regions where it operates.

"As part of our commitment to creating a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI, while working to improve the efficiency of large-scale systems in training and application," the spokesperson said.

Google responded to Barron's by pointing to a blog post describing what the company calls a "climate-conscious approach" to cooling its data centers and its commitment to "advocating for responsible water use."

According to the blog post, in 2021 the average Google data center consumed about 450,000 gallons of water per day, roughly the amount of water needed to irrigate 17 acres of lawn grass one time. That is also about the water needed to produce 160 pairs of jeans, including growing the cotton for them.

Google said that as climate change continues to exacerbate water challenges around the world, the company will continue to be "committed to investing in technologies that reduce energy and water consumption."

Text | Lauren Foster

Editor | Yu Zhou

Copyright Notice:

This is an original article by Barron's China and may not be reproduced without permission. For the English version, see the April 19, 2023, report "AI ChatBots Guzzle Water. How and Why It's a Problem."

(The content of this article is for informational purposes only; the investment advice offered does not represent the views of Barron's. Markets are risky, and investment requires caution.)
