
AI datacenters drink more water than you think

The Hidden Water Cost of AI Infrastructure

Vikram Sekar's piece cuts through the noise surrounding AI datacenter water consumption by exposing a critical flaw in how we measure environmental impact. The debate has become polarized between those claiming datacenters are trivial water users and those warning of ecological catastrophe. Sekar argues the truth depends entirely on design choices — and that today's solutions may simply trade one environmental problem for another.

The Burger Metric Problem

The most viral claim in this debate comes from SemiAnalysis, which stated that xAI's Colossus 2 datacenter consumed the same amount of water as 2.5 In-N-Out burger restaurants. David Sacks flagged this as a "narrative violation" — a contrarian take that captured the attention of millions.


Vikram Sekar writes, "Although there is nothing wrong in flagging contrarian takes like Sacks did, it captures the attention of millions and unintentionally provides a means for policy makers to unknowingly understate the importance of water use in AI datacenters."

The dissonance is stark. If datacenter water use is truly negligible, why did Google's facility in The Dalles, Oregon, initially refuse to disclose consumption as a trade secret? Why did legal battles reveal the facility was using a quarter of the city's entire water supply? And why did The Dalles recently attempt to expand its reservoir by drawing from Mount Hood National Forest, triggering immediate environmental concern?

"Not all infrastructure buildouts are reducible to burger metrics."

Sekar notes the Google Oregon facility used evaporative cooling with hydroelectric power from Columbia River dams — where evaporation pushes water consumption above 10,000 gallons per megawatt-hour. xAI's Colossus 2, by contrast, uses dry adiabatic cooling and on-site aeroderivative turbines requiring zero water. Same class of infrastructure, vastly different water profiles.
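To make the contrast concrete, here is a back-of-envelope sketch. The per-megawatt-hour rates are the article's figures; the 1 GW load and round-the-clock full utilization are simplifying assumptions added here for scale:

```python
# Daily water draw for an assumed 1 GW datacenter under two designs.
# Rates are the article's figures: >10,000 gal/MWh for the evaporative
# cooling + hydro profile, roughly zero for dry adiabatic cooling.
LOAD_MW = 1000        # assumed GW-class facility
HOURS_PER_DAY = 24

designs = {
    "evaporative + hydro (Google, The Dalles)": 10_000,  # gal/MWh, lower bound
    "dry adiabatic (xAI Colossus 2)": 0,                 # gal/MWh
}

for name, gal_per_mwh in designs.items():
    daily_gal = LOAD_MW * HOURS_PER_DAY * gal_per_mwh
    print(f"{name}: {daily_gal / 1e6:.0f} million gallons/day")
# The evaporative profile works out to 240 million gallons/day at the
# 10,000 gal/MWh lower bound — the gap between the two designs is not
# a rounding error.
```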

Critics might note that comparing datacenters to fast food oversimplifies both industries — beef production's water footprint is geographically distributed, while datacenter consumption is intensely localized, creating acute regional stress.

Carbon Versus Water Tradeoffs

Understanding the water-power connection requires thermodynamics. The Rankine cycle — used in coal and nuclear plants — boils water into steam to drive turbines. It is water-hungry by design. Nuclear reactors are actually more thirsty than coal plants due to lower thermal efficiencies from safety constraints on reactor temperatures.

The Brayton cycle works differently. Air is compressed, fuel is burned, and hot expanding gas drives a turbine directly. No steam. No water needed. This is what aeroderivative simple-cycle turbines use.

Vikram Sekar writes, "Natural gas is the one fuel source that can use both cycles. In a simple-cycle configuration, natural gas is burned to spin a turbine and generate electricity. Combined-Cycle Gas Turbines use the Brayton cycle to burn gas, drive turbines and generate electricity. But they also capture the hot exhaust and feed it into a heat recovery steam generator to drive a second steam turbine on top of the first."

Combined-cycle gas turbines hit roughly 60 percent efficiency compared to 35 to 40 percent for simple-cycle, but the steam side needs water for cooling — typically 400 to 1,200 gallons per megawatt-hour. Most grid-scale natural gas plants today are combined-cycle because efficiency gains justify the water costs.
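The efficiency-versus-water tradeoff above can be sketched numerically. The efficiencies (35 to 40 percent simple-cycle, roughly 60 percent combined-cycle) and the 400 to 1,200 gal/MWh cooling range come from the text; the 1 GW flat-out load is an assumption:

```python
# Efficiency-vs-water tradeoff for an assumed 1 GW load running 24 h/day.
DAILY_ELEC_MWH = 1000 * 24   # 1 GW for 24 hours

def thermal_input(elec_mwh, efficiency):
    """Fuel energy (thermal MWh) needed to deliver elec_mwh of electricity."""
    return elec_mwh / efficiency

simple = thermal_input(DAILY_ELEC_MWH, 0.38)    # midpoint of 35-40%
combined = thermal_input(DAILY_ELEC_MWH, 0.60)

water_low = DAILY_ELEC_MWH * 400 / 1e6     # million gal/day
water_high = DAILY_ELEC_MWH * 1200 / 1e6

print(f"simple cycle:   {simple:,.0f} thermal MWh/day, ~0 cooling water")
print(f"combined cycle: {combined:,.0f} thermal MWh/day, "
      f"{water_low:.1f}-{water_high:.1f} million gal/day of cooling water")
print(f"fuel saved by going combined cycle: {(1 - combined / simple) * 100:.0f}%")
```

Roughly a third less fuel burned, paid for in millions of gallons of cooling water per day — which is why grid operators choose combined-cycle and water-constrained datacenter sites may not.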

Behind-the-Meter Power and Its Costs

The electrical grid cannot keep up. Between power plant construction and interconnection infrastructure — transformers, transmission lines, substations — wait times range from four to twelve years. Datacenters cannot wait. Billions in invested infrastructure must generate revenue quickly.

The solution is behind-the-meter power generation on-premises. Three options exist: aeroderivative simple-cycle turbines burning natural gas, solid oxide fuel cells converting natural gas electrochemically, or massive solar arrays with storage. Sekar focuses on the first two, both producing carbon dioxide.

Vikram Sekar writes, "The burning of natural gas produces prodigious amounts of nitrogen oxides that have an immediate impact on the local environment. What happened in xAI's Colossus 1 in Memphis is a cautionary tale. The company deployed 35 gas turbines on-site to power its GPU cluster before a grid connection was available, and local residents quickly raised alarms about the stench and air quality."

Two emission-reduction methods exist. Dry Low Emissions (DLE) technology pre-mixes air and fuel to lower combustion temperatures, but it makes partial-load operation difficult and does not prevent benzene emissions. Water injection sprays demineralized water into the turbine to reduce flame temperature, consuming 25 to 50 gallons per minute per turbine. Twenty turbines powering a gigawatt datacenter could use one million gallons per day just for emission control.
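The million-gallon figure checks out arithmetically. A short sketch, using the text's 25 to 50 gallons-per-minute range and its assumed twenty-turbine gigawatt site:

```python
# Daily water-injection consumption for twenty turbines (the text's
# assumed GW-class deployment) at 25-50 gallons per minute per turbine.
MINUTES_PER_DAY = 24 * 60
N_TURBINES = 20

low = 25 * N_TURBINES * MINUTES_PER_DAY    # gal/day at 25 gpm
high = 50 * N_TURBINES * MINUTES_PER_DAY   # gal/day at 50 gpm

print(f"{low / 1e6:.2f} to {high / 1e6:.2f} million gallons/day")
# 0.72 to 1.44 million gallons/day — "about one million gallons per day"
# sits inside the range.
```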

Colossus 1 did not use water injection because it was built in record time; setting up water-treatment infrastructure would have delayed the project. Once a 150 megawatt substation came online, xAI began removing the turbines. About half remain for Phase 2 until a second substation is completed, after which they become backup.

Critics might note that treating emission-control water consumption as optional rather than standard understates the true environmental cost of rapid deployment timelines.

Solid Oxide Fuel Cells

Solid oxide fuel cells use a solid oxide electrolyte to convert natural gas into electricity through electrochemical reaction, not combustion. They virtually eliminate nitrogen oxide pollutants, achieve over 50 percent electrical efficiency, require no water for power generation, and deploy faster than turbines.

Bloom Energy markets a 100 megawatt per acre configuration by vertically stacking power modules — about twice the density of turbines and 100 times that of solar panels. Solid oxide fuel cells are more than twice as expensive on a capital-expenditure basis, but simpler permitting and lower land use make them attractive.
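The land-use argument can be made concrete. The 100 MW/acre figure is Bloom Energy's marketing number cited in the text; the turbine and solar densities below are derived from the text's "twice the density of turbines, 100 times solar" ratios, not independently sourced, and the 1 GW target is an assumption:

```python
# Acres needed to site an assumed 1 GW of behind-the-meter capacity.
TARGET_MW = 1000

density_mw_per_acre = {
    "solid oxide fuel cells (stacked)": 100,  # Bloom Energy's cited figure
    "aeroderivative turbines": 50,            # half of fuel-cell density
    "solar panels": 1,                        # ~1/100th of fuel-cell density
}

for tech, density in density_mw_per_acre.items():
    print(f"{tech}: {TARGET_MW // density:,} acres for {TARGET_MW} MW")
# 10 acres vs 20 acres vs 1,000 acres: land, not capex, is where fuel
# cells win.
```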

Natural Gas Bottlenecks

Behind-the-meter technologies are intended as bridges to future grid interconnection. Aeroderivative turbines and solid oxide fuel cells will persist as backup sources or power boosters. They cannot serve as long-term solutions for one reason: natural gas availability.

Vikram Sekar writes, "If dozens of GW-class datacenters all use natural gas on-premises to power their infrastructure, the next big bottleneck on the horizon will be natural gas pipelines. Building gas pipelines is as difficult as building electrical interconnection infrastructure and can take years from start to finish."

Pipeline ruptures are single points of failure while electrical grids have built-in redundancy. Behind-the-meter power plants only shift the electrical infrastructure bottleneck to gas infrastructure.

Nuclear Inevitability

This leads to the only option for large-scale, zero-emission power: nuclear energy. Nuclear provides carbon-free power through fission, not fossil fuels. It delivers over 90 percent capacity factor, versus 50 to 60 percent for natural gas and under 40 percent for renewables. Nuclear runs continuously, unlike solar, which generates power six to nine hours daily and then requires expensive storage. Land footprint is minimal. Supply chain concerns are negligible.
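Capacity factor translates directly into how much nameplate capacity must be built. A sketch using the text's capacity factors (0.55 taken as the midpoint of the gas range, 0.40 as the renewables upper bound) and an assumed 1 GW average load, ignoring the storage and firming renewables would additionally need:

```python
# Nameplate capacity required to *average* an assumed 1 GW load.
TARGET_AVG_GW = 1.0

capacity_factor = {
    "nuclear": 0.90,       # "over 90 percent" in the text
    "natural gas": 0.55,   # midpoint of 50-60%
    "renewables": 0.40,    # upper bound of "under 40 percent"
}

for source, cf in capacity_factor.items():
    print(f"{source}: {TARGET_AVG_GW / cf:.2f} GW nameplate")
# Nuclear needs ~1.11 GW of nameplate; renewables need at least 2.5 GW
# before storage is even considered.
```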

But nuclear uses more water than coal power, at roughly 670 gallons per megawatt-hour. The plumes rising from cooling towers are evaporated water leaving the system for good.

Vikram Sekar writes, "To power a GW-class datacenter like Colossus with nuclear power, it takes 0.67 million gallons of water per hour, or 16 million gallons per day — which is 16 times higher than SemiAnalysis' estimate."

The water consumption is highly localized while beef production is spread out. Nuclear remains highly sought after for datacenter power, but the water footprint cannot be dismissed.

Bottom Line

The burger comparison is a temporary truth that obscures long-term reality. Today's behind-the-meter gas solutions trade water for carbon; tomorrow's nuclear solutions trade carbon for water. Sekar's piece forces an uncomfortable conclusion: AI infrastructure cannot escape thermodynamics. Every design choice creates environmental tradeoffs, and the burger metric — while catchy — fails to capture the localized, intensifying pressure datacenters place on regional water systems. The Mount Hood reservoir expansion attempt in The Dalles proves the stakes are real, not rhetorical.

Sources

AI datacenters drink more water than you think

by Vikram Sekar · Vik's Newsletter

