Jon Y exposes a silent crisis powering the AI revolution: while the world fixates on energy consumption, the massive water footprint of data centers threatens to collide with global water scarcity. This piece is notable not for its technical depth alone, but for its stark quantification of the trade-off—revealing that a single 15-megawatt facility consumes as much water annually as three hospitals. For busy leaders in tech and policy, the implication is immediate: the infrastructure for the next decade of intelligence is being built in regions that may not have the water resources to sustain it.
The Hidden Cost of Cooling
The core of Y's argument rests on a fundamental physical reality often ignored in boardrooms: almost all electricity consumed by servers eventually becomes heat that must be removed. Y writes, "Almost all of a data center's consumed electricity is converted to heat. Even if a data center isn't working at its full capacity... it is still withdrawing 60-100% of its maximum power." This is a crucial distinction. The demand for cooling is not linear with usage; it is a constant, heavy burden that scales with the facility's maximum potential, not just its current load.
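The non-linearity Y describes can be sketched with simple arithmetic: if an idle facility still draws the low end of the 60-100% range he cites, its heat rejection—and hence cooling demand—is pinned near nameplate capacity regardless of workload. The facility size and utilization figures below are illustrative assumptions, not numbers from the piece.

```python
# Sketch: cooling demand tracks nameplate capacity, not utilization.
# Per the piece, an idle data center still draws 60-100% of its maximum
# power, and almost all consumed electricity becomes heat to be removed.
# The 15 MW facility and 30% utilization are illustrative assumptions.

def annual_heat_mwh(max_power_mw: float, utilization: float,
                    idle_floor: float = 0.6) -> float:
    # Effective draw never falls below the idle floor of maximum power.
    draw_mw = max_power_mw * max(idle_floor, utilization)
    # Nearly all of that draw ends up as heat (8,760 hours per year).
    return draw_mw * 8760

# A 15 MW facility at 30% utilization still rejects heat as if it were
# running at 60% -- the cooling plant must be sized and run accordingly.
low_util_heat = annual_heat_mwh(15, 0.30)
```

Only above the idle floor does cooling demand begin to scale with load, which is why water budgets end up sized against a facility's maximum potential rather than its average usage.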
Y details how the industry relies heavily on evaporative cooling towers, a process where water is sacrificed to the atmosphere to dissipate heat. He notes, "About 1% of the water evaporates for every 10 degrees Fahrenheit of cooling... Regardless, evaporated water must be replaced with new, make-up water." This creates a relentless cycle of consumption. The author effectively reframes the efficiency metric, Power Usage Effectiveness (PUE), not just as an energy score, but as a water proxy. Larger facilities tend to post better (lower) PUE figures, yet the absolute volume of water lost remains staggering. A counterargument worth considering is that air-cooled systems are becoming more efficient, but Y rightly points out that without water-based cooling, the energy penalty would be even higher, creating a vicious cycle of increased power demand and indirect water use.
As Y puts it: "The evaporated water leaves the tower as steam. About 1% of the water evaporates for every 10 degrees Fahrenheit of cooling."
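Y's 1%-per-10°F rule can be turned into a rough make-up-water estimate: the heat load fixes how much water must circulate, and the rule fixes how much of it evaporates. The sketch below uses the 15 MW facility size mentioned in the summary; the 10°F cooling range and the assumption of full draw are illustrative, not from the piece.

```python
# Rough make-up-water estimate from the rule of thumb quoted in the
# piece: ~1% of circulating water evaporates per 10 F of cooling range.

CP_WATER = 4.186  # kJ/(kg*K), specific heat of liquid water

def makeup_water_l_per_h(heat_load_kw: float, range_f: float = 10.0) -> float:
    """Evaporative loss for a given heat load and cooling-tower range."""
    range_k = range_f * 5.0 / 9.0
    # Circulating flow needed to carry the heat load, in kg per hour.
    circulating_kg_h = heat_load_kw / (CP_WATER * range_k) * 3600
    # The article's rule: ~1% evaporates per 10 F of range (1 kg ~ 1 L).
    return circulating_kg_h * 0.01 * (range_f / 10.0)

# A 15 MW heat load (assuming draw near maximum) with a 10 F range:
hourly_l = makeup_water_l_per_h(15_000)
annual_m3 = hourly_l * 8760 / 1000  # on the order of 200,000 m3/year
```

Note that widening the range shrinks the circulating flow but raises the evaporated fraction by the same factor, so under this rule the water lost depends essentially on the heat rejected, not on the chosen range—which is why evaporative water use tracks power so tightly.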
The Geography of Scarcity
Perhaps the most alarming section of the piece is the geographic mismatch between where data centers are being built and where water is available. Y highlights that while tech giants claim sustainability, the majority of their direct water withdrawal still comes from drinkable municipal supplies. He observes, "Some 40-50% of the global population lives in areas suffering water scarcity, and many data centers need to be located near large population centers. Which means adding a new and competing demand for water."
The piece zeroes in on Arizona as a prime example of this tension. Y writes, "SemiAnalysis reports that one of the leading states in the US for data center buildouts is Arizona... But the area also experiences drought-like conditions from time to time and is highly dependent on the Colorado River for its water." This is not a theoretical risk; it is a current operational constraint. Companies like Meta are already responding by funding river restoration projects and seeking long-term storage credits, a move Y describes as a necessary pivot to "source its water using long-term storage credits so no water is taken from the municipal area." However, the sheer scale of the AI boom suggests these mitigation efforts may be playing catch-up.
The AI Accelerant
The commentary takes a sharp turn when connecting these static infrastructure issues to the dynamic explosion of generative AI. The data is sobering: Microsoft's water consumption jumped 34% in a single year, a surge Y attributes directly to the rollout of ChatGPT and similar models. He quotes Bob Blue, CEO of Dominion Energy, who notes a shift in demand patterns: "We're now receiving individual requests for demand of 60 megawatts to 90 megawatts or greater, and it hasn't stopped there... we get regular requests to support larger data center campuses that include multiple buildings and require total capacity ranging from 300 megawatts to as many as several gigawatts."
This section underscores that the problem is accelerating faster than the solutions. Y argues that the correlation between power and water is inescapable. As the industry moves toward more power-hungry chips like Nvidia's upcoming B100, the water demand will follow suit. He posits, "Just imagine water consumption growing that much too." Critics might argue that renewable energy adoption will eliminate indirect water use (the water consumed in generating a facility's electricity), but Y correctly identifies direct cooling as a separate, water-intensive bottleneck that solar panels cannot fix.
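The power-water correlation Y insists on can be made concrete with the industry's WUE metric (Water Usage Effectiveness, liters of water per kWh of IT energy). The sketch below is not from the piece: the 1.8 L/kWh value is a generic illustrative assumption, and the campus sizes echo the request range Dominion's CEO describes.

```python
# Sketch: direct cooling water scales linearly with power via WUE
# (Water Usage Effectiveness, liters of water per kWh of IT energy).
# The 1.8 L/kWh figure is an illustrative assumption, not a reported one.

def annual_water_m3(it_load_mw: float, wue_l_per_kwh: float = 1.8) -> float:
    kwh_per_year = it_load_mw * 1000 * 8760
    return kwh_per_year * wue_l_per_kwh / 1000  # liters -> cubic meters

# Scaling a 90 MW request up to a 300 MW campus multiplies direct water
# demand by the same factor -- renewables change neither side of this.
small_campus = annual_water_m3(90)
large_campus = annual_water_m3(300)
```

Because WUE measures water drawn for cooling, not for electricity generation, switching the grid supply to solar leaves this term untouched—precisely the bottleneck Y identifies.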
The Path Forward: Free Cooling and Heat Recapture
In the final analysis, Y explores whether the industry can decouple compute from consumption. He examines "free cooling," which utilizes ambient air or cold seawater to bypass evaporative towers. He cites the example of Google's Hamina facility in Finland, which "takes in fresh seawater from the ocean using the paper mill's existing pipes." While effective, this solution is geographically limited. Y also discusses heat recapture, suggesting that the waste heat could warm nearby homes, but notes the infrastructure hurdles: "The issue is that we cannot efficiently move heat as far as we can move electricity. The demand sources - i.e. the households - need to be relatively close to the data center."
The author concludes that while raising operating temperatures slightly could yield financial benefits, the industry is still grappling with contradictory data on hardware reliability at higher heat levels. The ultimate verdict is that the future of AI is inextricably linked to water availability. Y writes, "If things develop as they seem to be developing, future data centers will need to rapidly adopt a combination of free-cooling, waste heat recovery, and renewable energy like solar."
Y summarizes the stakes: "The future of compute and AI will require far more electricity - and water. First at the semiconductor fabrication stage... Then after the chips are fabbed, we put them into these data centers and run them to full tilt."
Bottom Line
Jon Y's analysis is strongest in its refusal to treat water as a secondary footnote to energy, correctly identifying it as the primary physical constraint on the AI boom's geography. The piece's greatest vulnerability is its reliance on corporate sustainability reports, which, while improving, still lack the granular, real-time transparency needed to assess true local impact. Readers should watch for the next wave of regulatory friction in water-stressed regions like the American Southwest, where the clash between digital ambition and hydrological reality will likely force a hard pivot in where and how the next generation of data centers is built.