THE ENERGY-HUNGRY AI REVOLUTION: A RACE FOR SUSTAINABLE SOLUTIONS
Part I. AI: A Key Tool for Combating Climate Change and Increasing Resource Efficiency
Part II: https://cleanenergyrevolution.co/2024/12/31/the-energy-hungry-ai-revolution-part-ii/
The AI revolution brings incredible opportunities, but also a growing energy and water demand. Explore how we can innovate and collaborate to power AI sustainably for a greener future.
The situation can be expressed as a trilemma, best pictured as an Euler diagram with three overlapping demands. First, there is enormous demand for energy of all kinds (green, but also nuclear as a bridge between fossil fuels and renewables) to supply the chip industry and the Artificial Intelligence data centers it feeds. Second, the revolutionary advance of the knowledge and technology economy, the main driver of an epochal transformation, must not be slowed. Third, the private sector must meet the sustainability and zero-emissions targets set by supranational organizations, yet the very states that dictate those compliance deadlines are not delivering the incentives or the infrastructure, conventional or green, fast enough to keep the tangible and intangible machinery of progress moving.
A major consideration, both currently and looking ahead to 2030 and beyond, is the escalating energy consumption driven by AI applications. Rapid progress in both hardware and software has given rise to large language models (LLMs) that rival human proficiency across a wide array of valuable tasks, significantly increasing power demands. Growing data center load exacerbates ongoing supply chain concerns for electrical equipment (e.g., transformers, switching equipment, generation equipment, advanced transmission technologies) in both the near and longer term.
The tech industry’s soaring energy needs are clashing with the urgent need to reduce greenhouse gas emissions. This challenge is particularly acute in Asia, where countries are eager to attract tech investment but struggle to provide enough clean energy.
This shortfall makes it difficult for companies to meet emissions reduction targets. Across the region's economies, demand for electricity is outstripping supply, even from non-renewable sources.
The boom in AI data centers is further exacerbating the problem, as these facilities consume massive amounts of power. Even the manufacturing process for AI servers is far more energy-intensive than for traditional servers. Apple, Microsoft, Google and Samsung have joined RE100, a corporate initiative to commit to 100% use of renewable energy, and they are asking suppliers to quickly and aggressively cut emissions.
The lack of reliable infrastructure is another obstacle. Tech suppliers report that power outages and unstable electricity supplies are impacting their operations, making it even harder to transition to renewable energy sources. The number of data centers in the region has at least doubled over the past two years. The target of this new investment is also shifting from traditional servers toward AI servers, which will make up about 37% of all data centers by the end of the decade, up from around 16% last year.
In line with this trend, data center energy demand in the Asia-Pacific is forecast to rise more than 300% between 2024 and 2030, reaching as high as 6.5 gigawatts. One gigawatt is roughly the output of a conventional nuclear reactor. The lead time for constructing and bringing a large data center online is around two to three years, while adding new electric infrastructure (generation, transmission, substations) can take four years or more.
The challenge is clear: how to reconcile the tech industry’s massive energy demands with the urgent need to cut greenhouse gas emissions.
BEYOND CARBON: THE URGENT NEED TO ADDRESS AI’S WATER FOOTPRINT
Data centers are notorious for their significant scope-2 carbon footprint due to electricity consumption. However, their immense water usage is often overlooked. Beyond the “hidden” water used in their supply chains (scope-3), data centers consume vast amounts of water for both on-site cooling and off-site power generation (scope-1 and 2, respectively).
Even excluding third-party facilities, Google’s own data centers directly withdrew 25 billion liters and consumed nearly 20 billion liters of water for on-site cooling in 2022, primarily potable water. Alarmingly, Google’s overall water usage surged by 20% compared to the previous year, with Microsoft’s increasing even more drastically at 34%. The rising demand for AI is likely a contributing factor to this escalating water consumption.
This underscores the urgent need to address the substantial water footprint of data centers, especially as AI continues to expand.
HOW DOES AI USE WATER?
Scope-1 Water Usage. Many data centers, including Google's, utilize cooling towers to dissipate heat. In suitable climates, they might instead employ "free cooling" with outside air, potentially supplemented by water, to cool servers directly, eliminating the need for cooling towers.
Scope-2 Water Usage. Similar to their responsibility for scope-2 carbon emissions, data centers, including those running AI workloads, also contribute to off-site water consumption due to the electricity they use. The amount of water used varies depending on the type of power plant generating the electricity (e.g., coal and natural gas).
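The scope-2 accounting above can be sketched as a simple weighted calculation: multiply the electricity a data center draws by the water intensity of each generation source in its supply mix. The intensity figures below are illustrative assumptions for the sake of the sketch, not authoritative values; real factors vary widely by plant, cooling technology, and region.

```python
# Sketch: estimating scope-2 (off-site) water consumption of a data center
# from its electricity use and the water intensity of the generating mix.

# ASSUMED water intensities in liters per kWh, for illustration only.
WATER_INTENSITY_L_PER_KWH = {
    "coal": 1.9,
    "natural_gas": 1.0,
    "nuclear": 2.3,
    "solar_pv": 0.1,
}

def scope2_water_liters(energy_kwh: float, mix: dict[str, float]) -> float:
    """Weighted off-site water use for a generation mix (shares sum to 1)."""
    assert abs(sum(mix.values()) - 1.0) < 1e-9, "mix shares must sum to 1"
    return energy_kwh * sum(
        share * WATER_INTENSITY_L_PER_KWH[source]
        for source, share in mix.items()
    )

# Example: 1 GWh of load served by 50% gas, 30% coal, 20% solar PV.
mix = {"natural_gas": 0.5, "coal": 0.3, "solar_pv": 0.2}
print(f"{scope2_water_liters(1_000_000, mix):,.0f} L")  # → 1,090,000 L
```

Swapping in a cleaner mix (more solar, less coal) immediately shrinks the estimate, which is why the choice of power source matters for water as well as carbon.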
Scope-3 Water Usage. AI chip and server manufacturing uses a huge amount of water. For example, ultrapure water is needed for wafer fabrication, and clean water is also needed for keeping semiconductor plants cool. Unlike scope-1 and scope-2 water usage, the data for scope-3 water usage (including withdrawal and consumption) remains largely obscure.
With high-density server racks requiring water cooling, reducing data centers’ water footprint is essential. Studies show that data centers consume significant freshwater, with Water Usage Effectiveness (WUE) increasing drastically during Machine Learning model training. This poses another trilemma: balancing energy needs, carbon footprint, and water consumption.
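Water Usage Effectiveness, mentioned above, is conventionally defined as liters of site water consumed per kWh of IT equipment energy, so a rising WUE during model training means each unit of compute is getting thirstier. A minimal sketch, with the 500 MWh and 900,000 L figures chosen purely as hypothetical inputs:

```python
def wue(site_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water per kWh of IT energy."""
    return site_water_liters / it_energy_kwh

# Hypothetical example: a training period drawing 500 MWh of IT energy
# at a site that consumed 900,000 L of water over the same period.
print(wue(900_000, 500_000))  # 1.8 L/kWh
```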
A potential solution involves integrating an aero gas turbine into a hybrid microgrid. This not only reduces carbon emissions but also enables heat recovery from the turbine’s exhaust. This recovered heat can power an absorption chiller, generating and storing chilled water for indirect server cooling, addressing the concerns of data centers that prefer to avoid direct contact between servers and cooling mediums.
For instance, a 300 MW data center might have a 15 MW heat load, requiring about 4265 Refrigeration Tons (RT). A single turbine could provide over double this cooling capacity while achieving an overall plant efficiency of up to 80%. Other options like combined cycle and steam generation could be explored depending on specific requirements. This integrated solution benefits data centers, particularly those focused on AI and ML, by enabling sustainable growth while addressing key resource constraints.
Furthermore, we advocate for greater transparency concerning the water footprint of AI models. Achieving truly sustainable AI necessitates a comprehensive approach that addresses both water and carbon footprints in tandem.
The second and final part of this article will be coming soon.
Diego Balverde
Economist
European Central Bank
Federico Weinhold
Specialist Researcher