
How AI is ruining the electric grid

Sam Denby delivers a jarring reality check: artificial intelligence isn't just reshaping our economy; it is actively degrading the physical quality of the electricity powering our homes. While most debates focus on carbon footprints or raw energy volume, Denby exposes a subtler, more immediate threat: harmonic distortion that makes your refrigerator rattle and your electronics die sooner. This is not a distant climate crisis; it is a mechanical failure happening in your kitchen right now.

The Invisible Noise

Denby begins by dismantling the binary view of power as simply "on" or "off." He argues that electricity has a quality, and AI is ruining it. "Artificial intelligence is making your electricity worse," he writes, noting that while we usually think of power in terms of availability, it can be judged in a qualitative sense as well. The author illustrates this with the standard 60 Hz sine wave of North American power, explaining that AI data centers are causing it to look "a bit more like this," filled with harmonic distortions.


This is a crucial distinction. The piece explains that while some distortion is normal, there is an 8% threshold considered safe for household appliances. Denby points out that "when a household is near a major data center cluster, research has indicated that the likelihood that readings exceed that threshold goes dramatically up." The consequence is not just annoyance but physical damage. He describes how a refrigerator motor, subjected to fluctuating current, creates oscillations that lead to a rattling sound. "That's to say, AI is making your refrigerator louder," Denby states. "And that noise is more than just annoying. It's representative of mechanical stress that will lead that motor to die out sooner."

The argument here is visceral and grounded in tangible consumer experience rather than abstract grid theory. By linking the abstract concept of "harmonic distortion" to the concrete reality of a broken appliance, Denby makes the stakes personal. Critics might argue that modern appliances are built with better filters to handle such noise, but the author's reliance on the 8% threshold suggests that current hardware is not designed for the specific, high-frequency chaos introduced by these new data centers.
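The 8% figure refers to total harmonic distortion (THD): the combined energy in the harmonics relative to the fundamental. As a rough illustration of how that quantity is measured (the function, harmonic amplitudes, and sampling rate below are invented for the example, not taken from the video), a synthesized 60 Hz wave polluted by typical rectifier-load harmonics lands right near the threshold:

```python
import numpy as np

def total_harmonic_distortion(signal, fs, f0=60.0, n_harmonics=10):
    """Estimate THD: ratio of harmonic magnitudes to the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def mag_at(f):
        # Magnitude at the FFT bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]

    fundamental = mag_at(f0)
    harmonics = [mag_at(f0 * k) for k in range(2, n_harmonics + 2)]
    return np.sqrt(sum(h**2 for h in harmonics)) / fundamental

fs = 6000                      # samples per second (assumed)
t = np.arange(0, 1, 1 / fs)    # one second of signal
clean = np.sin(2 * np.pi * 60 * t)
# Add 5th and 7th harmonics, the kind produced by rectifier loads
# such as server power supplies (amplitudes are illustrative)
distorted = (clean
             + 0.06 * np.sin(2 * np.pi * 300 * t)
             + 0.05 * np.sin(2 * np.pi * 420 * t))

print(f"clean THD:     {total_harmonic_distortion(clean, fs):.1%}")      # ~0.0%
print(f"distorted THD: {total_harmonic_distortion(distorted, fs):.1%}")  # ~7.8%, near the 8% limit
```

A wave that looks nearly sinusoidal to the eye can still sit just under the limit; a few percentage points more harmonic content pushes nearby households past it.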

That's to say, AI is making your refrigerator louder. And that noise is more than just annoying. It's representative of mechanical stress that will lead that motor to die out sooner.

The Semi-Truck Problem

The commentary then shifts to the hardware driving this demand. Denby contrasts the steady, predictable power usage of traditional cloud computing with the erratic, massive spikes of AI. He uses a powerful analogy to explain why Graphics Processing Units (GPUs) are so energy-intensive. "If a CPU, which is what you'll typically find running the show within a data center, is an electric car capable of taking care of one process extremely quickly and efficiently, then a GPU is a semi-truck, getting a whole lot of tasks done at once, but taking a good bit of gas to do it."

This metaphor effectively captures the scale of the problem. The author notes that while a single chip like Nvidia's H100 might consume 700 watts, the real issue is multiplication. "To run any sort of AI infrastructure, a data center needs tens of thousands of these GPUs to run even the simplest of large language models," Denby writes. The result is a facility that consumes power like a small city. Furthermore, the demand is not just high; it is accelerating exponentially: where GPT-1 required 18,000 petaflops of training compute, GPT-4 required 21 billion. "When or if the demand or the technological advancements that make such data processing possible to begin with ever begin to slow is difficult to tell," he admits, highlighting the uncertainty of future growth.
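Denby's per-chip figure makes for quick arithmetic. The sketch below multiplies it out; the 30,000-GPU cluster size and the 1.5 facility overhead factor (PUE) are illustrative assumptions, not numbers from the video:

```python
# Back-of-the-envelope scale of an AI data center's draw,
# using the 700 W per H100 figure quoted above.
gpu_watts = 700
gpu_count = 30_000          # assumed cluster size ("tens of thousands")

compute_mw = gpu_watts * gpu_count / 1e6
# Cooling, networking, and power conversion add overhead on top of the
# chips themselves; a PUE of 1.5 is an assumed, plausible value.
facility_mw = compute_mw * 1.5

print(f"GPU draw:      {compute_mw:.0f} MW")   # 21 MW
print(f"Facility draw: {facility_mw:.1f} MW")  # 31.5 MW, on the order of a small city
```

Tens of megawatts per facility, multiplied across a cluster of facilities like Northern Virginia's, is how individual buildings start to register at grid scale.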

The strength of this section lies in its rejection of the idea that efficiency gains will solve the problem. Denby argues that "the issue is not just how much power AI uses, it's how and where it uses it." This reframing is vital. It moves the conversation away from simple conservation to the structural incompatibility between AI's operational needs and the grid's design.

The Grid's Breaking Point

Perhaps the most alarming part of Denby's coverage is the discussion of grid stability. He explains that the electric grid operates on a razor-thin margin where supply must match demand instantly. The problem arises when AI data centers, which are hypersensitive to power fluctuations, suddenly disconnect to protect their hardware. Denby recounts a specific incident in Northern Virginia on July 10, 2024, where a lightning arrestor failure triggered a cascade. "The effect of all these data centers dropping off the grid simultaneously led to an almost instantaneous loss of 1,500 megawatts of load," he writes. "1,500 megawatts is enormous. That's like if all of Iceland or Namibia or Jamaica went offline."

This sudden drop creates a frequency spike that grid operators cannot instantly correct. Denby notes that "physics dictated the response," causing the grid frequency to rise from 60 Hz to 60.047 Hz. While this deviation seems small, it demonstrates a terrifying vulnerability. "The proportion of demand that could disappear in an instant will grow larger and so too will the consequences of their impacts on the grid," he warns. The author connects this to historical blackouts, noting that "cascading frequency issues such as this were a major component of the 2025 blackout in Spain and Portugal, as well as other notorious blackouts like Texas in 2021."
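The physics Denby gestures at can be sketched with the standard swing-equation approximation, which relates a sudden power imbalance to the initial rate of frequency change. The inertia constant and online capacity below are assumed round numbers, not PJM's actual figures, so the result only shows the order of magnitude of the effect:

```python
# Minimal swing-equation sketch: why losing load makes frequency rise.
f0 = 60.0            # nominal grid frequency, Hz
delta_p_mw = 1_500   # load that vanished in the Virginia incident
system_mw = 150_000  # assumed online generation in the region
H = 4.0              # assumed aggregate inertia constant, seconds

# Initial rate of change of frequency: RoCoF = f0 * ΔP / (2 * H * S)
rocof = f0 * delta_p_mw / (2 * H * system_mw)
print(f"Initial RoCoF: +{rocof:.3f} Hz/s")
print(f"Frequency after 1 s, before governors respond: {f0 + rocof:.3f} Hz")
```

Even with generous assumptions, a 1,500 MW step pushes frequency up at a rate comparable to the 0.047 Hz excursion the grid actually recorded; and as the disconnectable load grows, so does the step.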

The argument is compelling because it highlights a paradox: the very safety mechanisms data centers use to protect their expensive chips (switching to batteries instantly) are what threaten the stability of the wider grid. A counterargument worth considering is that grid operators are aware of these risks and are likely upgrading infrastructure, but Denby points out a significant bottleneck: "In the case of Loudoun County, home to data center alley, there's not, at least not yet," referring to the lack of capacity. This has led to a 4 to 7-year backlog for connection requests, creating a tense standoff between tech expansion and grid reality.

The proportion of demand that could disappear in an instant will grow larger and so too will the consequences of their impacts on the grid.

Bottom Line

Sam Denby's strongest move is shifting the AI energy debate from carbon emissions to grid physics, proving that the technology's impact is already degrading the reliability of everyday appliances. The argument's biggest vulnerability is its reliance on the assumption that current grid upgrades cannot keep pace with the exponential growth of AI compute, a race that could tip either way. Readers should watch for the next major grid fault in data center-heavy regions, as that will be the true stress test of this theory.

Sources

How AI is ruining the electric grid

Artificial intelligence is making your electricity worse. You might not have even known that was a possibility for your power to be good or bad. We typically experience electricity in a binary sense of available or not. But it can be judged in a qualitative sense as well.

And the electricity near AI data centers, well, it's bad. This is what your power is supposed to look like, at least in North America, a 60 Hz sine wave. AI, however, is causing it to look a bit more like this. These are harmonic distortions, deviations from perfection.

Now, power is never perfect. There's always some harmonic distortion, but there's a threshold that is considered acceptable, 8%. And when a household is near a major data center cluster, research has indicated that the likelihood that readings exceed that threshold goes dramatically up. This has implications.

Your refrigerator, for example, includes a motor to operate its compressor. When the current is not smooth, the fluctuations in power will lead to fluctuations in torque as it spins around its axis. Meaning, in addition to the rotational force it's intended to create, it'll also create oscillations as it spins. As these oscillations interact with the motor casing and surrounding components, it creates noise, a rattling sound.

So, that's to say, AI is making your refrigerator louder. And that noise is more than just annoying. It's representative of mechanical stress that will lead that motor to die out sooner, spoiling your groceries and mandating a costly, cumbersome fix. The same applies to all the rest of your electronics.

The 8% harmonic distortion threshold is set since it's what normal household appliances can handle without shortening their lifespans. So, incrementally and ever so slightly, AI is costing you money by making your electronics wear down faster. It's no secret that artificial intelligence in its current form is tremendously energy-intensive. That undoubtedly matters, but demand is potentially the simpler problem to solve, especially when ignoring carbon goals.

The issue is not just how much power AI uses, it's how and where it uses it. Making AI a net positive for everyday Americans in particular considering the tech sector's concentration in the country will be an enormous challenge. Not just due to the potential implications on employment and creativity and disinformation and more, but also just due to the sheer strain on the electric grid. This decade's most ...