Elon Musk – "in 36 months, the cheapest place to put AI will be space"

Elon Musk has a prediction that's hard to ignore: in three years or less, the cheapest place to put artificial intelligence won't be on Earth—it'll be in space. The claim sounds like science fiction, but Musk lays out an engineering case that hasn't quite entered mainstream discourse yet. He's not just speculating about distant futures; he's pointing to immediate constraints in power generation that are already hitting data centers today. His math is striking: solar panels work five times better in space than on the ground, and you skip the cost of batteries entirely. This isn't theoretical—it may already be happening.

The Power Problem No One's Talking About

The total cost of owning a data center is only about ten to fifteen percent energy. The rest is GPUs—expensive silicon that degrades quickly. Move those chips to space, and the economics shift dramatically. But the real bottleneck isn't the chips themselves. It's electricity.

Outside China, electrical output has been essentially flat for years. China's power supply is growing fast, but everywhere else, it's static. Meanwhile, AI chip production is scaling exponentially. The problem: you can't turn the chips on without power. Musk calls it the "magical power source" issue—there's no magical electricity fairy waiting to appear.

The math gets stark quickly. One terawatt of average solar power, at a roughly 25 percent capacity factor, would require about four terawatts of panels, covering roughly one percent of U.S. land area. That's a massive physical footprint. But here's what's catching everyone off guard: data centers already consume far more power than most people realize. A facility with 110,000 GB300 GPUs, including networking hardware, storage, cooling systems, and reserve margin for servicing, needs roughly one gigawatt at the generation level. That's not trivial. That's equivalent to a small city.
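A quick sanity check on that land-area figure. The sketch below uses the 25 percent capacity factor from the interview; the land-use power density (roughly 40 W/m², typical for utility-scale farms once panel spacing is included) and the U.S. land area are assumed illustrative constants, not numbers from the source.

```python
# Back-of-envelope check of the solar land-area claim (a sketch; only
# the ~25% capacity factor comes from the source).

CAPACITY_FACTOR = 0.25       # ground solar: nights, clouds, seasons
LAND_USE_W_PER_M2 = 40       # assumed: utility-scale density incl. panel spacing
US_LAND_AREA_KM2 = 9.1e6     # approximate U.S. land area

target_avg_watts = 1e12      # 1 TW of average delivered power

# Nameplate capacity needed to average 1 TW at a 25% capacity factor
nameplate_watts = target_avg_watts / CAPACITY_FACTOR

panel_area_km2 = nameplate_watts / LAND_USE_W_PER_M2 / 1e6
fraction_of_us = panel_area_km2 / US_LAND_AREA_KM2

print(f"Nameplate needed: {nameplate_watts / 1e12:.0f} TW of panels")
print(f"Land required: {panel_area_km2:,.0f} km^2 (~{fraction_of_us:.1%} of U.S. land)")
```

With these assumptions the numbers land on four terawatts of panels and about one percent of U.S. land, consistent with the figures quoted above.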

And it's getting worse. Cooling alone can require a forty percent increase in power during the hottest days of the year. Add redundancy for servicing generators, and you're looking at another twenty-five percent multiplier on top of that.
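Those stacked multipliers compound quickly. A minimal sketch, assuming a hypothetical all-in draw of about 5.2 kW per GPU slot (covering networking and storage share; the per-slot figure is illustrative and not from the source):

```python
# Illustrative generation-level sizing for a 110,000-GPU facility.
# The per-slot power draw is an assumed figure; the 40% cooling and
# 25% redundancy multipliers are the ones quoted in the article.

GPUS = 110_000
WATTS_PER_SLOT = 5_200        # assumed: GPU plus networking/storage share

it_load_w = GPUS * WATTS_PER_SLOT       # base IT load
peak_w = it_load_w * 1.40               # +40% cooling on the hottest days
generation_w = peak_w * 1.25            # +25% margin for servicing generators

print(f"IT load:          {it_load_w / 1e9:.2f} GW")
print(f"Peak w/ cooling:  {peak_w / 1e9:.2f} GW")
print(f"Generation level: {generation_w / 1e9:.2f} GW")
```

Under these assumptions the total comes out near the one gigawatt at the generation level that the article cites.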

Why Space Changes Everything

The appeal isn't about being futuristic; it's about physics. In space there is no atmosphere, no day-night cycle, no seasonal changes or clouds. Solar panels in space are roughly five times more effective than ground-based panels, and you skip batteries entirely because the sun never stops shining.

Musk points out that solar cells are already absurdly cheap, around twenty-five to thirty cents per watt in China, and putting them in space makes them effectively ten times cheaper per delivered watt once you drop the battery costs. The cost of access to space is falling fast. Combined, these factors create an order-of-magnitude advantage for orbital deployment.
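One way to see where a "ten times cheaper" figure could come from is cost per watt of *average* delivered power. The sketch below takes the cell cost from the article; the battery cost and the capacity factors (25 percent on the ground, near-100 percent in a suitable orbit) are illustrative assumptions.

```python
# Sketch of the orbital cost advantage in $ per average delivered watt.
# Cell cost is from the article; battery cost and capacity factors are
# assumed for illustration.

CELL_COST_PER_W = 0.28       # ~$0.25-0.30/W cell cost quoted for China
BATTERY_COST_PER_W = 0.40    # assumed: storage to carry load through the night
GROUND_CF = 0.25             # assumed ground capacity factor
SPACE_CF = 0.99              # near-continuous sunlight in a suitable orbit

ground_cost = (CELL_COST_PER_W + BATTERY_COST_PER_W) / GROUND_CF
space_cost = CELL_COST_PER_W / SPACE_CF

print(f"Ground: ${ground_cost:.2f} per average watt")
print(f"Orbit:  ${space_cost:.2f} per average watt")
print(f"Ratio:  {ground_cost / space_cost:.1f}x")
```

Under these assumptions the ratio comes out near ten. Launch and integration costs, which this sketch ignores, would eat into that advantage.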

"It's five times more effective than on the ground and you avoid the cost of having batteries to carry you through the night."

The scaling argument gets even starker when you think about how much total power humanity actually uses. The entire United States currently runs on roughly half a terawatt average. A single terawatt would double that consumption. The numbers are forcing people to confront hardware in a way software engineers never had to.

GPU Servicing Isn't the Showstopper

One obvious question: what happens when GPUs fail? Training requires reliable chips, and failure rates can be significant. But Musk pushes back on this narrative. Once past infant mortality—the early debugging phase—modern GPUs are actually quite reliable. Whether they're Nvidia's latest silicon, Tesla's AI chips, or Google's TPUs, the reliability curve stabilizes after initial deployment.

The servicing challenge isn't trivial, but it's not the obstacle people assume either. Past a certain threshold, you don't need constant intervention. The real friction is elsewhere—power generation itself.

The Utility Problem Nobody Solves

This is where Musk's argument gets most concrete. Building power plants on Earth is notoriously difficult. The utility industry moves slowly—they're impedance-matched to government approval processes and public utility commissions. Interconnection agreements can take a year just for the study phase, then another year for results.

Musk describes what it took to get one gigawatt of power online for xAI: coordinating turbines across state lines, building high-power transmission lines, navigating permit issues in Tennessee and Mississippi. Very difficult. The turbine backlog is massive; most manufacturers are sold out through 2030. The limiting factor isn't the turbine casings themselves but the blades and vanes inside them, a specialized casting process that only three companies in the world can perform.

The tariffs on solar imports into the U.S. are enormous, making domestic production difficult. But SpaceX and Tesla are building toward a hundred gigawatts of solar cell production, covering the entire stack from raw materials to finished cells.

The Regulatory Advantage

Space isn't just physically better for solar; it's regulatorily easier too. Building data centers on Earth requires permits, land acquisition, environmental reviews, and community approvals. In space, you skip all that. There's no land to acquire, no atmospheric loss to mitigate, and no weather events that panels must survive.

Musk puts it plainly: it's harder to scale on the ground than in space. The regulatory path is simpler, the physical constraints are fewer, and the economics are fundamentally different.

Critics might note that this analysis underestimates the engineering challenges of maintaining data centers in orbit—radiation exposure, orbital mechanics, launch costs, and the logistics of replacing failed components in space remain significant hurdles. It's not clear these obstacles will prove easier to solve than terrestrial power generation, and some analysts would argue the comparison oversimplifies both domains.

Bottom Line

Musk's prediction is bold but grounded in concrete physics: solar in space is five times more efficient than Earth-based, without battery costs or atmospheric loss. The real constraint isn't GPU reliability—it's whether humanity can generate enough power on Earth to meet AI demands at scale. That problem may already be intractable for ground-based solutions. The strongest part of this argument is the physical math; the biggest vulnerability is that launch costs and orbital infrastructure remain unpredictable, making the timeline uncertain. Watch whether SpaceX's launch cadence and solar production actually materialize within that thirty-six-month window.

Sources

Elon Musk – "in 36 months, the cheapest place to put AI will be space"

by Dwarkesh Patel

>> So, are there really three hours of questions, or are you serious?

>> Yeah. You don't even talk about Elon, man.

>> It's the most interesting point. All the story lines are kind of converging.

>> Yeah. Right now, so we'll see how much...

>> Almost like I planned it.

>> Exactly. Well, we'll get...

>> I would never do such a thing. So, as you know better than anybody else, the total cost of ownership of a data center, only 10 to 15% is energy. And that's the part you're presumably saving by moving this into space. Most of it's the GPUs. If they're in space, it's harder to service them, or you can't service them, and so the depreciation cycle goes down on them. So it's just way more expensive to have the GPUs in space, presumably. What's the reason to put them in space?

>> Well, the availability of energy is the issue. If you look at electrical output outside of China, everywhere outside of China it's more or less flat, maybe a slight increase, but pretty close to flat. China has a rapid increase in electrical output. But if you're putting data centers anywhere except China, where are you going to get your electricity? Especially as you scale: the output of chips is growing pretty much exponentially, but the output of electricity is flat. So how are you going to turn the chips on?

>> Magical power sources, magical electricity fairies.

>> You're famously a big fan of solar. One terawatt of solar power, so with a 25% capacity factor, that's like four terawatts of solar panels, it's like 1% of the land area of the United States. And how far into the singularity are we when we've got one terawatt of data centers? So what are we running out of?

>> How far into the singularity are you, though?

>> You tell me.

>> Yeah, exactly. I think we'll find we're in the singularity and, like, okay, we've still got a long way to go.

>> But is the plan to put it in space after we've covered Nevada in solar panels?

>> I think it's pretty hard to cover Nevada in solar panels. Try getting the permits for ...