
Evaluating Generative AI’s Environmental Impact

8 mins read

In November 2022, the California startup OpenAI launched ChatGPT, the online service that made its GPT-3.5 language model accessible to nontechnical users for the first time. ChatGPT quickly became the fastest-growing consumer service in history, racking up 100 million users in just two months. OpenAI is now worth around $150 billion. Other companies have benefited too. Nvidia, a maker of the computer chips that power AI, has quintupled in value over the past year, while Microsoft, which partnered with OpenAI to bring chatbot technology into its products, recently overtook Apple to become the world's most valuable company.

Today, it is difficult to find any domain — from the classroom to Silicon Valley — in which generative AI technology does not play a part. Yet users rarely consider, when enjoying the productivity boons of AI, the extent to which this digital technology takes physical form in the real world. The growing uptake of generative AI is contributing significantly to U.S. and global energy consumption, water usage, and climate footprints. And although none of those factors is likely to meaningfully slow the growth of generative AI as a sector, perhaps they will give you pause the next time you ask ChatGPT to whip up a baking recipe, suggest travel destinations, or help you understand a difficult passage in your English reading.

Generative AI technology is built on a class of computer chips called graphics processing units (GPUs). Originally designed to render detailed graphics for video game enthusiasts, GPUs have proven adept at crunching numbers for AI companies. ChatGPT and related technologies are trained on a vast corpus of data — they have read, watched, and listened to much of the Internet. The sum of these trillions of snippets of information is used to create a probabilistic model of language, visual art, or even music and video. As a result, AI can complete phrases and write paragraphs based purely on the probability of certain words, or sequences of words, appearing in relation to one another within the context provided by a prompt. This information is "learned" on GPUs — and when you type a prompt into ChatGPT, GPUs weigh the model's roughly two trillion learned parameters before deciding which words to spit out.
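
To make that idea concrete, here is a toy sketch in Python of what "completing a phrase based on probability" means in practice. The two-entry probability table is invented for illustration and bears no relation to the trillions of parameters a real model learns.

```python
# A toy illustration of next-word prediction. The probability table below is
# invented for this example; a real model learns such probabilities from
# trillions of snippets of text rather than a hand-written dictionary.
import random

NEXT_WORD_PROBS = {
    "peanut": {"butter": 0.9, "allergy": 0.1},
    "butter": {"cookie": 0.5, "and": 0.3, "knife": 0.2},
}

def sample_next_word(previous_word: str) -> str:
    """Pick a continuation in proportion to its probability."""
    candidates = NEXT_WORD_PROBS[previous_word]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

# Generate a short continuation one probabilistic step at a time.
word = "peanut"
sentence = [word]
while word in NEXT_WORD_PROBS:
    word = sample_next_word(word)
    sentence.append(word)
print(" ".join(sentence))  # e.g. "peanut butter cookie"
```

Real systems work over tens of thousands of word fragments and condition on the entire prompt, but the core loop (score the candidates, pick one, repeat) is the same.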

GPUs require enormous amounts of energy to run and cool, and their impact on the environment is significant and fast-growing. In the case of AI models, that impact begins long before a product like ChatGPT ever becomes available to the public. GPT-4, an advanced model for which OpenAI customers shell out twenty dollars per month, was trained in a Microsoft computing center near Des Moines, Iowa. During the month preceding GPT-4's release, the project consumed 6% of the local district's water supply across its 295,000 computer chips. Data centers in warmer climes require even more cooling: those in Northern Virginia and Las Vegas, which together account for over 80% of U.S. capacity, are perhaps twice as thirsty as Iowa's. Microsoft has been slapped on the wrist by the local water authority, which says it won't approve more data centers until the company cuts down its water usage. The overall trend, however, points to a vast increase in water consumption driven by AI: Google's water footprint spiked by 20% last year, and Microsoft's jumped by 34%. Together, the two companies use about as much water as one million American households.

Running AI models once they are trained is even more taxing. Depending on a user's location and the time of year, a chatbot may draw one liter of water for every 10-100 queries. Moreover, water used for cooling data centers generally isn't reused – it is considered more economical and energy-efficient to discharge it than to treat and recirculate it.
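
For a rough sense of scale, the back-of-envelope calculation below applies that 10-100 queries-per-liter range to an assumed volume of 100 million queries per day; the query volume is a round number chosen purely for illustration, not a reported figure.

```python
# Back-of-envelope water estimate using the 1 liter per 10-100 queries range.
# The daily query count is an assumed round number, not a reported figure.
QUERIES_PER_DAY = 100_000_000        # assumption for illustration only
LITERS_PER_QUERY_LOW = 1 / 100       # best case: one liter per 100 queries
LITERS_PER_QUERY_HIGH = 1 / 10       # worst case: one liter per 10 queries

low = QUERIES_PER_DAY * LITERS_PER_QUERY_LOW
high = QUERIES_PER_DAY * LITERS_PER_QUERY_HIGH
print(f"Estimated daily draw: {low:,.0f} to {high:,.0f} liters")
# -> roughly 1 million to 10 million liters per day under these assumptions
```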

Overall energy consumption associated with AI models is also high – indeed, demand has risen so sharply that authorities have begun withholding power from new data centers. In Northern Virginia ("Data Center Alley"), the local utility suspended new power hookups for several months last year as it grappled with the technical implications of such vast, concentrated power use. Ireland — where roughly 20% of electricity goes to data centers — has gone further, banning new grid connections for computing facilities until 2028. Microsoft is sidestepping the problem by building a stand-alone power plant for its new data center near Dublin.

Tech companies may be sapping the nation's water and energy reserves, but there is no question of generative AI development being halted over environmental concerns. Instead, tech titans are attempting — whether out of the goodness of their hearts or a fear of public rebuke — to mitigate the effects of the general-purpose technology that generative AI has become. Microsoft and Google point out that they are among the largest investors in green energy anywhere. The former has described plans to power its data centers with nuclear energy by 2028; both are investing in wind and solar power. Their technical prowess may also yield solutions. Paradoxically, the most effective remedy for AI's environmental woes will likely involve more AI: Google has trialed a system that uses a machine-learning model to redirect non-urgent computing requests away from local data centers and towards ones with lower carbon footprints. Google, Meta, and Apple have pledged to achieve zero-carbon operations by 2030, and Microsoft and Amazon by 2035 and 2040 respectively. Environmentally minded consumers will also increasingly demand that tech giants curb their more ecologically destructive excesses. History suggests, too, that computing will become less energy-intensive with time: Koomey's law holds that the energy needed for a given amount of computing halves roughly every thirty months, assuming companies use the very latest chips.
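
The routing idea is simple enough to sketch. The snippet below is a hypothetical illustration of the principle rather than Google's actual system: flexible jobs are deferred to whichever region's grid is currently cleanest, while urgent ones stay put. The region names and carbon-intensity figures are invented for the example.

```python
# Hypothetical sketch of carbon-aware scheduling: urgent jobs run locally,
# while flexible jobs are deferred to the data center whose grid currently
# has the lowest carbon intensity. Region names and gCO2/kWh figures are
# invented for illustration.
CARBON_INTENSITY = {
    "iowa": 350,
    "nevada": 410,
    "finland": 120,
}

def route_job(urgent: bool, local_region: str) -> str:
    """Return the region where the job should run."""
    if urgent:
        return local_region
    return min(CARBON_INTENSITY, key=CARBON_INTENSITY.get)

# A flexible overnight job submitted from Nevada gets shifted to Finland.
print(route_job(urgent=False, local_region="nevada"))  # -> "finland"
```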

Yet tech giants' promises of a more sustainable future do nothing to insulate the world from their present thirst for energy and water. The fact remains that, for the moment, AI is wreaking havoc on the nation's power and water networks. So, the next time Shakespeare leaves you scratching your head, consider striking up a real-life conversation rather than opening the ChatGPT app – and spare the West Des Moines Water Authority another angry email to OpenAI.
