Most discussions of artificial intelligence fixate on the algorithms themselves, but Kenny Easwaran shifts the lens to the brutal, unglamorous reality that powers them: electricity, cooling, and a fragile global supply chain. This is not a lecture on code, but a wake-up call about the physical infrastructure required to sustain the digital revolution. Easwaran argues that the exponential growth of AI is hitting a hard wall of physics: the demand for compute is outpacing the hardware improvements that have historically sustained it.
The Hidden Cost of Intelligence
Easwaran begins by dismantling the assumption that digital services are weightless. He notes that while companies once offered free services to sell ads, the new model is to "provide free services in order to gather data from users that they then sell." However, the true bottleneck isn't data collection; it's the energy required to process it. The author draws a sharp historical parallel to the late 19th century, noting that just as Thomas Edison and George Westinghouse competed to sell electricity, the modern world is facing a similar inflection point. "Electricity generation in the United States kept rising" for over a century, but Easwaran points out that "a few years after 2000 electricity generation has leveled off." This plateau is about to be shattered by the electrification of everything, and above all by the voracious appetite of data centers.
The commentary here is crucial: the heat generated by computation is as much a problem as the power consumed. Easwaran explains that while efficiency has improved, the energy spent just cooling the machines is "still a significant fraction" of the total. This physical constraint dictates geography. The author observes that data centers are clustering in places like Portland, Oregon, and Reykjavik, Iceland, not for tax breaks, but because they offer "excellent access to cheap and clean electricity" and cold water for cooling. This reframes the AI boom from a software race to a resource war. The location of intelligence is now determined by the availability of hydro, geothermal, or wind power.
"The biggest distinctive new demand for neural net-based AI is computational power."
The Exponential Gap
The most startling evidence Easwaran presents is the sheer scale of the increase in computation required to train modern models. Using a logarithmic scale, he illustrates that "GPT 4 released just over a year ago took about 100 times as much computation to train as GPT-3 did." The jump from early models like AlexNet to today's giants is staggering; the author notes that training a modern system requires "nearly a million times as much computation." This exponential curve is outpacing the hardware that runs it.
Easwaran addresses the famous observation by Gordon Moore, noting that while transistor density has doubled every year or two for decades, "the amount of computational power used by neural nets has gone up by over a factor of well over a million in a decade." Moore's Law is no longer keeping pace with the hunger of the algorithms. The author traces the history of computing hardware, from the vacuum tubes, magnetic tapes, and mercury delay lines of computing's earliest decades onward, to show that the dominant technology is always in flux. "There's no reason to think that [the transistor] will always continue being the most important one," Easwaran writes, suggesting we are approaching the physical limit at which a transistor shrinks to the scale of a single atom.
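To make that gap concrete, here is a rough back-of-the-envelope sketch (mine, not Easwaran's) using the figures quoted above: roughly a million-fold growth in training compute over about a decade, set against Moore's Law read as transistor density doubling every two years. The decade-long window and the exact endpoints are illustrative assumptions.

```python
import math

# Back-of-the-envelope comparison using the figures quoted above.
# Assumptions (illustrative, not from the source): the ~1,000,000x growth in
# training compute accrued over roughly a decade, and Moore's Law is read as
# transistor density doubling every ~2 years.

years = 10
compute_growth = 1e6           # ~million-fold increase in training compute per decade
moore_doubling_years = 2.0     # transistor density doubles every ~2 years

# Implied doubling time for training compute: years / log2(total growth factor)
compute_doubling_years = years / math.log2(compute_growth)

# Growth that Moore's Law alone would deliver over the same decade
moore_growth = 2 ** (years / moore_doubling_years)

print(f"Training compute doubles every ~{compute_doubling_years:.2f} years "
      f"(roughly every {compute_doubling_years * 12:.0f} months)")
print(f"Moore's Law over the same decade yields only ~{moore_growth:.0f}x more transistors")
```

On those assumptions, training compute doubles roughly every six months, while Moore's Law alone delivers only about a 32-fold gain over the same decade: exactly the mismatch Easwaran is pointing at.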
Critics might note that this focus on raw compute power overlooks potential breakthroughs in algorithmic efficiency, where smarter code could reduce the need for brute-force calculation. However, Easwaran's data on the current trajectory suggests that for the foreseeable future, the industry is betting on scale, not just smarts.
The Geopolitics of Silicon
The piece pivots from physics to geopolitics, revealing a supply chain that is astonishingly concentrated. Easwaran highlights that the Graphics Processing Units (GPUs) essential for AI training are dominated by a single company, NVIDIA, which now supplies "about 80% of all GPUs." The market value of this dominance is unprecedented; NVIDIA recently "passed Google and Amazon and even Saudi Aramco... to become the third most valuable corporation in the world." This rise from outside the top 10 in 2021 to a global titan in 2024 underscores the sudden, massive shift in economic power toward chip designers.
But the story doesn't end with design. Easwaran explains that neither NVIDIA nor AMD manufactures its own chips; they rely on foundries like the Taiwan Semiconductor Manufacturing Company (TSMC). The critical bottleneck, however, lies even further upstream. The equipment needed to print these microscopic circuits, the lithography machines, is controlled by a Dutch firm, ASML, which accounts for "nearly 80% of the world's supply." "None of these corporations are based in China," Easwaran emphasizes, listing the US, South Korea, Taiwan, Japan, and the Netherlands as the exclusive hubs of this critical infrastructure. This concentration has led to strict export controls, with the author comparing the potential impact on China's AI development to the "1973 Arab Oil Embargo."
"There's a possibility that Intel might move into this business and give up on their own chip design."
This framing of the semiconductor industry as a geopolitical choke point is the piece's most potent insight. It moves the conversation from "who has the best AI model" to "who controls the factory that makes the brain." The author notes that while there are disagreements among allied nations about the strictness of these embargoes, the structural reality remains: the hardware required for advanced AI is geographically and politically constrained. A counterargument worth considering is whether China's massive domestic market and state-backed investment could eventually break this monopoly, but the data the author presents suggest a significant lag is inevitable.
Bottom Line
Kenny Easwaran's strongest contribution is exposing the physical fragility of the AI boom, showing that the future of intelligence is bound by the limits of electricity and the geography of chip manufacturing. The argument's vulnerability lies in its assumption that current hardware trajectories are immutable, potentially underestimating how quickly alternative computing paradigms could mature. Readers should watch how the global tension over chip exports reshapes the pace of innovation in the coming decade.
"The biggest distinctive new demand for neural net-based AI is computational power."
The most striking takeaway is that the race for artificial intelligence has become a race for physical resources, where the winners are determined not just by code, but by access to clean energy and the most advanced lithography machines on Earth.