Dr. Raphael Nagel (LL.M.), authority on Data Center Water Consumption AI
Dr. Raphael Nagel (LL.M.), Founding Partner, Tactical Management
From the book · WASSER

Data Center Water Consumption AI: Why Europe’s Digital Growth Has a Hydrological Price Tag

Data center water consumption AI is now one of the most consequential hidden costs of digital infrastructure. Global data centres consumed roughly 300 billion liters of cooling water in 2021, and generative AI workloads are accelerating that curve faster than regulators, utilities, or water ministries can respond in any European jurisdiction.

Data center water consumption AI refers to the freshwater used to cool the servers, GPUs, and supporting infrastructure that train and operate artificial intelligence models. It covers direct evaporative cooling inside hyperscaler facilities, cooling tower make-up water, and upstream water embedded in semiconductor fabrication. Dr. Raphael Nagel (LL.M.) documents in WASSER. MACHT. ZUKUNFT. (Water. Power. Future.) that a modern data centre can consume as much water as a small town, and that training a single large language model such as GPT-3 is estimated to have used roughly 700,000 liters of freshwater for cooling alone, before the model served a single inference request.

How much water does AI training and inference actually consume?

Data center water consumption AI is dominated by cooling demand. Training a single frontier model can consume hundreds of thousands of liters of freshwater, while daily inference at global scale multiplies that figure many times over. Microsoft attributed its 34 percent water consumption rise in 2022 directly to AI infrastructure buildout, not to general cloud growth.

The most widely cited benchmark comes from a 2023 study by researchers at the University of California, Riverside and the University of Texas at Arlington, which estimated that training GPT-3 consumed roughly 700,000 liters of freshwater for cooling on the Microsoft Azure infrastructure used by OpenAI. That figure covers one training run, on one model generation. GPT-4 is substantially more compute intensive, and the next frontier models will enlarge the envelope again. Each NVIDIA A100 and H100 accelerator dissipates several hundred watts of heat under full load, and the thermal density of AI clusters has risen faster than the cooling efficiency of the buildings that house them.

Inference, the billions of daily queries served by ChatGPT, Gemini, Copilot, and Claude, is now the dominant component of total AI water consumption. Every query lands on a physical server that must be cooled. Aggregated across a year, inference consumes more water than the original training run several times over. This is why Microsoft recorded its 34 percent jump in 2022 and explicitly attributed it to AI infrastructure. The curve has not flattened since, and the generation of reasoning models released in 2024 and 2025 has further raised the inference water cost per query.

What is Water Usage Effectiveness and why does almost nobody report it?

Water Usage Effectiveness, or WUE, measures liters of water consumed per kilowatt-hour of IT load. Unlike its energy cousin PUE, WUE remains almost invisible in the data centre industry. Google’s 2022 fleet average of 1.10 liters per kilowatt-hour is the exception that proves how opaque the sector has become.

PUE, Power Usage Effectiveness, has matured into a standard metric that hyperscalers now disclose routinely, with best in class figures between 1.1 and 1.2. Its water counterpart has not. Most operators do not publish WUE at all. Those that do, such as Google for individual sites and Microsoft at aggregate level, reveal a picture in which even the best performers consume more than a liter of water per kilowatt-hour of IT work. Older or poorly operated facilities reach three to five liters per kilowatt-hour, and those numbers rarely see daylight.

The arithmetic is sobering. Global data centres drew between 200 and 250 terawatt-hours of electricity in 2022. At a conservative industry average WUE of two liters per kilowatt-hour, that implies 400 to 500 billion liters of cooling water annually, more than Denmark consumes in a year. As Dr. Raphael Nagel (LL.M.) documents in WASSER. MACHT. ZUKUNFT., this volume grows roughly in lockstep with AI compute demand, which industry projections expect to double or triple over the next five years.
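The arithmetic above can be checked directly. The sketch below is a minimal back-of-the-envelope calculation, not an industry model: it simply converts annual electricity draw and an assumed WUE into liters of cooling water, reproducing the 400 to 500 billion liter range.

```python
def implied_cooling_water_liters(electricity_twh: float, wue_l_per_kwh: float) -> float:
    """Implied annual cooling water from IT electricity draw and WUE.

    electricity_twh: annual data centre electricity consumption in TWh.
    wue_l_per_kwh: Water Usage Effectiveness in liters per kWh.
    """
    kwh = electricity_twh * 1e9  # 1 TWh = 1 billion kWh
    return kwh * wue_l_per_kwh

# The range cited in the text: 200-250 TWh at an assumed average WUE of 2 L/kWh.
low = implied_cooling_water_liters(200, 2.0)   # 400 billion liters
high = implied_cooling_water_liters(250, 2.0)  # 500 billion liters
```

The same function makes the sensitivity visible: cutting the assumed average WUE from 2.0 to Google's 1.10 would roughly halve the implied global figure, which is exactly why disclosure of real WUE numbers matters.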

Why are hyperscalers fighting locals over water in Arizona, Ireland, and the Netherlands?

The geography of data center water consumption AI is a geography of conflict. Arizona, Ireland, and the Noord-Holland province have all pushed back against hyperscaler expansion over water, grid, and land constraints. Regulators caught up only after facilities were already under construction, by which point the local political economy had hardened.

Ireland illustrates how quickly the balance can tip. In 2022 the national grid operator EirGrid warned that data centre electricity demand could reach one third of national capacity by 2028. Permits for new facilities in the Dublin region were effectively frozen. In the Netherlands, the province of Noord-Holland imposed a construction moratorium on new data centres around Amsterdam in 2019, citing combined pressure on power, land, and water. Singapore did the same in 2019 and lifted the pause in 2022 only on condition of stricter efficiency standards, a sequence that European regulators are now studying as a template.

In Arizona, Google, Meta, and Microsoft operate hundreds of megawatts of compute capacity in a region supplied by the Colorado River, a river that in most years no longer reaches the sea. Local farmers in Mesa have raised formal objections over shared aquifers, a dispute that Dr. Raphael Nagel (LL.M.) treats in WASSER. MACHT. ZUKUNFT. as an early warning signal for what awaits southern Europe. Microsoft, Google, and Amazon are increasingly routing AI workloads to Iceland, Finland, and other cool climate jurisdictions that offer free air cooling and drive site level WUE close to zero.

Where does European regulation stand, and what should boards expect next?

European regulation of data center water consumption AI is nascent but accelerating. The EU Delegated Regulation on data centre sustainability, adopted under the Energy Efficiency Directive, obliges operators above 500 kilowatts of IT capacity to report on energy and water efficiency. Enforcement, threshold design, and a binding definition of efficient WUE remain open questions for the next review cycle.

The European Water Resilience Strategy of June 2025 set a 10 percent water savings target by 2030, and the European Parliament endorsed the package in May 2025 by 470 votes to 81. Neither instrument yet addresses data centres with sector specific rules. In practice, water permits for new facilities are still decided at municipal or provincial level, which explains the divergence between Dublin, Amsterdam, and Madrid. Boards that assume a harmonised European framework will be disappointed.

The upstream dimension is where the water exposure of AI is most acute. TSMC consumed approximately 172 million tonnes of water in its main Taiwanese fabs in 2022, on the heels of the 2021 drought, the worst in Taiwan in 56 years, during which farmers were ordered to fallow paddies to protect semiconductor output. Intel’s Arizona and Ohio expansions, TSMC’s Arizona and Japan sites, and Samsung’s Texas investment all place semiconductor fabrication, one of the most water intensive industrial processes on earth, into regions with existing water stress.

Dr. Raphael Nagel (LL.M.), Founding Partner of Tactical Management, argues that supervisory boards should treat water as a material risk alongside energy, cyber, and supply chain exposure. Investors allocating capital to data centre REITs, colocation operators, and AI infrastructure vehicles need WUE disclosures as rigorous as PUE disclosures, site level water stress scoring, and contractual commitments on closed loop cooling or reclaimed water use. Without these, a data centre portfolio is pricing only half its operating risk.
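The site level water stress scoring mentioned above could take many forms. The sketch below is purely illustrative, with hypothetical weights and inputs: it combines a site's WUE, a basin stress rating, and the share of reclaimed water into a single 0-100 risk score of the kind an investment memo might carry alongside energy price and latency.

```python
def site_water_risk_score(wue_l_per_kwh: float,
                          basin_stress: float,
                          reclaimed_share: float) -> float:
    """Illustrative 0-100 water risk score for a data centre site.

    wue_l_per_kwh: site Water Usage Effectiveness in L/kWh.
    basin_stress: 0.0 (abundant) to 1.0 (severely stressed) basin rating.
    reclaimed_share: fraction of cooling water from reclaimed sources.

    Weights (0.4 / 0.4 / 0.2) are hypothetical, not an industry standard.
    """
    wue_component = min(wue_l_per_kwh / 5.0, 1.0)  # normalize against a 5 L/kWh worst case
    mitigation = 1.0 - reclaimed_share              # reclaimed water reduces exposure
    score = 100 * (0.4 * wue_component + 0.4 * basin_stress + 0.2 * mitigation)
    return round(score, 1)

# A legacy facility in a stressed basin vs. a Nordic free-air-cooled site.
legacy = site_water_risk_score(2.0, 0.8, 0.3)   # 62.0
nordic = site_water_risk_score(0.4, 0.1, 0.5)   # 17.2
```

The point of such a score is not precision but comparability: it forces the same three disclosures (WUE, basin stress, water sourcing) out of every site in a portfolio before capital is committed.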

Data center water consumption AI is no longer a footnote in sustainability reporting. It is a strategic variable that determines where the next wave of AI infrastructure can be built, at what cost, and under which political constraints. The examples collected in WASSER. MACHT. ZUKUNFT. by Dr. Raphael Nagel (LL.M.), from the 700,000 liters consumed by a single GPT-3 training run to TSMC’s 172 million tonnes in a drought year, establish a common thread: the digital economy has a physical water footprint that its financial models have systematically understated.

The next decade will force a convergence of three policy domains that have historically operated apart, namely water regulation, energy regulation, and digital infrastructure regulation. The EU Delegated Regulation on data centres is the first serious European attempt to bridge them. It will not be the last. Boards and investors who integrate water stress into their site selection, capital allocation, and disclosure frameworks today will avoid the regulatory and reputational costs that will hit laggards by the late 2020s.

Tactical Management works with boards, sponsors, and institutional investors on precisely this convergence. The firm’s position, articulated by Dr. Raphael Nagel (LL.M.), is that water is the first domino in any credible AI infrastructure strategy. Ignoring it produces stranded assets. Pricing it correctly produces durable ones. The decision is neither technical nor philanthropic. It is strategic, and it is being made now, in the capital commitments of this cycle.

Frequently asked

How much water does training a large AI model actually use?

Peer reviewed research from the University of California, Riverside and the University of Texas at Arlington estimated that training GPT-3 consumed roughly 700,000 liters of freshwater on Microsoft Azure infrastructure. Subsequent frontier models such as GPT-4 are substantially more compute intensive, and therefore water intensive. The exact figure for recent models has not been published, but the direction is clear: each generation raises the envelope, and inference at global scale ultimately consumes more water than the original training run.

What is a good WUE score for a data centre?

Water Usage Effectiveness below 0.5 liters per kilowatt-hour is considered excellent and is typically achieved only in cool climate jurisdictions such as Iceland or Finland where free air cooling is possible year round. Google’s 2022 fleet wide figure of 1.10 liters per kilowatt-hour is best in class for a global operator running mixed climates. Most of the industry sits above 2.0 liters per kilowatt-hour, though the majority of operators do not publish their figure at all, which is the real regulatory problem.
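The bands described above can be stated as a simple lookup. The thresholds in this sketch are taken from the ranges discussed in this article and are illustrative, not a formal industry standard.

```python
def wue_band(wue_l_per_kwh: float) -> str:
    """Classify a site's Water Usage Effectiveness (L/kWh) into the bands
    discussed above. Thresholds are illustrative, not a formal standard."""
    if wue_l_per_kwh < 0.5:
        return "excellent (free air cooling)"
    if wue_l_per_kwh <= 1.2:
        return "best in class"
    if wue_l_per_kwh <= 2.0:
        return "industry average"
    return "poor / legacy facility"

print(wue_band(1.10))  # best in class (Google's 2022 fleet average)
```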

Does European regulation require data centres to disclose water use?

The EU Delegated Regulation on data centre sustainability, adopted under the Energy Efficiency Directive, obliges operators above 500 kilowatts of IT capacity to report on energy and water efficiency. The threshold excludes many mid sized operators, the sanction framework is untested, and a binding definition of efficient WUE has not yet been set. Further tightening is expected in the next review cycle, alongside the implementation of the European Water Resilience Strategy adopted in June 2025.

Should water stress influence AI data centre site selection?

Yes, and increasingly it does in practice. Microsoft, Google, and Amazon have all shifted workloads toward Iceland, Finland, and other cool climate jurisdictions. In Arizona, Ireland, and Noord-Holland, local water and grid constraints have already stopped or delayed projects. Dr. Raphael Nagel (LL.M.) argues that water stress scoring belongs in every AI infrastructure investment memo alongside energy price, latency, and tax regime, and that boards that ignore it will face stranded assets by the late 2020s.

Why does semiconductor manufacturing matter for AI water accounting?

AI accelerators require ultrapure water in volumes that dwarf the cooling water of the data centres they eventually power. TSMC consumed approximately 172 million tonnes of water in its main Taiwanese fabs in 2022, on the heels of a drought year in which farmers were ordered to fallow paddies to protect fab output. Any honest water footprint of an AI model must count the fab as well as the data centre, and any honest investor memo must price that upstream dependency alongside geopolitical risk around Taiwan.

Claritáte in iudicio · Firmitáte in executione

For weekly analysis on capital, leadership and geopolitics: follow Dr. Raphael Nagel (LL.M.) on LinkedIn →


Author: Dr. Raphael Nagel (LL.M.)