
Does OpenAI expect a government bailout?

The real story isn't whether AI will change the world; it's whether the world can afford to build it.

The $1.4 Trillion Question

OpenAI has signed more than $1.4 trillion in infrastructure commitments in recent months, with the goal of building out the data centers needed to meet soaring demand. But the company is nowhere near having the money required to complete those deals. Its losses are staggering: Microsoft's filings revealed that OpenAI lost roughly $11.5 billion in a single quarter, its worst on record. That pushes year-to-date losses north of $25 billion against projected annual revenue of about $20 billion.

Does OpenAI expect a government bailout?

The company has raised nearly $58 billion in equity so far and was valued at $500 billion last month. It's now talking about an IPO at a $1 trillion valuation next year, which would float the shares on an exchange and possibly bring in about $60 billion in cash. But that's just over 4% of its $1.4 trillion infrastructure commitments.

To bridge the gap, OpenAI has leaned on increasingly creative deal structures. Nvidia pledged up to $100 billion in reciprocal investments, while AMD granted OpenAI warrants to buy roughly 10% of its stock for a penny per share if deployment milestones are met. If all these deals worked out, OpenAI could bring in roughly $200 billion, but that still leaves it $1.2 trillion short.
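The arithmetic behind that shortfall fits in a few lines. This is a back-of-the-envelope sketch using only the round figures quoted above; the variable names and groupings are mine, not OpenAI's accounting:

```python
# Shortfall arithmetic in billions of USD, using the article's round figures.
# Illustrative only: these are headline numbers, not audited financials.
commitments = 1_400    # signed infrastructure commitments
deal_proceeds = 200    # rough take if the Nvidia and AMD structures all pay off
ipo_proceeds = 60      # hypothetical cash from a $1T-valuation IPO

shortfall = commitments - deal_proceeds
print(f"shortfall after partner deals: ${shortfall:,}B")
print(f"an IPO would cover {ipo_proceeds / commitments:.1%} of commitments")
```

Running it reproduces the article's figures: a $1,200B shortfall, with IPO proceeds covering only about 4.3% of the total commitment.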

The Backstop Gambit

This is not the first time OpenAI has looked to Washington for help. Just a month ago, the company sent a detailed letter to the White House urging the federal government to double down on semiconductor subsidies, asking that tax credits be expanded to cover the entire AI supply chain, from chip fabrication to data centers and grid hardware.

Sarah Friar, OpenAI's CFO, explained at a Wall Street Journal event that she was looking for an "ecosystem of banks, private equity, maybe even governmental" sources of financing, meaning a federal subsidy or backstop that guarantees loans. Such a guarantee would lower the cost of financing and raise the loan-to-value ratio, the amount of debt that can be layered on top of an equity portion.

The problem is straightforward: each gigawatt of compute costs around $50 billion, with $15 billion for land and infrastructure and $35 billion for GPUs. Lenders know how to finance data centers, which typically have 20-, 25-, even 30-year lives. Chips are much harder to finance, because their useful life as frontier hardware is uncertain: the faster chip innovation moves, the faster existing chips can be expected to depreciate. So the $35 billion of chips inside a $50 billion data center is very difficult to finance. Nobody wants to own assets that might collapse in value when the next generation ships, and lenders really don't want to accept them as collateral.
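A toy straight-line depreciation schedule shows why lenders treat the two halves of that $50 billion so differently. The cost split is the article's; the asset lives are my illustrative assumptions, not industry figures:

```python
# Straight-line book value of the two components of a $50B, 1 GW data center.
# Cost split ($15B shell, $35B GPUs) is from the article; asset lives are
# assumed for illustration.
SHELL_COST, SHELL_LIFE = 15, 25   # land, buildings, grid hardware: long-lived
GPU_COST, GPU_LIFE = 35, 5        # frontier chips: assume ~5 useful years

def book_value(cost, life, years):
    """Remaining value after straight-line depreciation over `life` years."""
    return max(cost - (cost / life) * years, 0.0)

for year in (1, 3, 5):
    print(f"year {year}: shell ${book_value(SHELL_COST, SHELL_LIFE, year):.1f}B,"
          f" GPUs ${book_value(GPU_COST, GPU_LIFE, year):.1f}B")
```

On these assumptions, the collateral behind 70% of the project's cost is worth zero within five years, while the shell has barely depreciated. That asymmetry is the whole financing problem in miniature.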

The result: banks don't want to lend, and the interest rate on a loan backed by rapidly depreciating chips would be so high that only government backing could make the numbers work.

OpenAI's CFO quickly walked back her suggestion in a LinkedIn post later that day, saying she had meant that the government needs to play its part, in combination with the private sector, in supporting America's AI growth, and that OpenAI was not seeking a government backstop for its infrastructure commitments. Sam Altman likewise tweeted that OpenAI does not have or want government guarantees for its data centers.

The Economics Don't Add Up

The unit economics of running the current generation of large language models are dire. The incentive for every player seems to be to grow the top line as much as possible, even if adding more users only deepens the losses. The models have negative unit economics, a fancy way of saying the company loses money on every sale and tries to make it up on volume.

In AI, costs rise almost linearly with usage, which is very different from traditional software; there's no marginal-cost magic. Despite an invitation-only rollout, OpenAI may be losing around $15 million a day, or $5 billion annualized, on Sora 2, its AI video-generation app.
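Negative unit economics can be made concrete with a two-line model. The per-user revenue and cost figures below are invented purely to show the shape of the problem; only the Sora 2 daily-loss estimate comes from the article:

```python
# When cost per user exceeds revenue per user, growth only deepens the loss.
# Per-user figures are invented for illustration.
def annual_loss(users, revenue_per_user=20.0, cost_per_user=35.0):
    """Loss grows linearly with users whenever cost > revenue per user."""
    return (cost_per_user - revenue_per_user) * users

for users in (1e6, 10e6, 100e6):
    print(f"{users:>11,.0f} users -> ${annual_loss(users) / 1e6:,.0f}M lost per year")

# Sanity check on the reported Sora 2 figure: $15M/day annualizes to ~$5.5B.
print(f"Sora 2 run rate: ${15e6 * 365 / 1e9:.1f}B/yr")  # → $5.5B/yr
```

With conventional software, `cost_per_user` falls toward zero as users are added; here it doesn't, so scale makes the hole bigger rather than filling it.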

Meanwhile, Nvidia posted a 62% jump in revenue for the three months to October, far ahead of expectations. Data center sales hit $51.2 billion, and the company raised its revenue forecast for the current quarter to $65 billion. For now, the numbers seem to justify the hype, but the worry is that this revenue, and its growth rate, are ultimately unsustainable.

The question hanging over Silicon Valley is not so much whether AI will change the world, but whether the world can afford to build it.

Bottom Line

The core problem remains: OpenAI has committed to $1.4 trillion in infrastructure with no clear path to pay for it. Creative financing deals with Nvidia and AMD bring in hundreds of billions, but leave a trillion-dollar gap that requires government backing to make the economics work. The company is essentially asking taxpayers to guarantee loans against rapidly depreciating AI chips, while burning tens of billions per year with no end in sight. Whether AI will transform the world remains uncertain; what seems increasingly clear is that building it requires money that doesn't exist.


Sources

Does OpenAI expect a government bailout?

by Patrick Boyle

In recent weeks, there's been a lot of market anxiety about the sustainability of the AI boom. This was partly driven by the outrage around Sarah Friar, OpenAI's finance chief, floating the idea that a government backstop for its $1.4 trillion data center buildout might be a good idea. Friar quickly walked back her suggestion in a LinkedIn post later that day, saying that she had meant that the government needed to play their part in combination with the private sector to contribute to America's AI growth, and that OpenAI was not seeking a government backstop for their infrastructure commitments. Her statement, while attempting to calm the outrage, only confused matters even further about how the not-yet-profitable startup plans to pay for its massive AI data center and chip commitments.

Sam Altman tweeted on the Everything app, "We do not have or want government guarantees for OpenAI data centers. We believe that governments should not pick winners or losers and that taxpayers should not bail out companies that make bad business decisions or otherwise lose in the market." Then it turned into a Bill Ackman tweet at that point, where he went on and on for around 20 pages. At first I was thinking, who would write a tweet that long? And then I realized that he had probably just used ChatGPT.

He knew that people would only read the first few lines, but he wanted to seem thoughtful, so he had to turn out an entire novel. The core problem for OpenAI is that they've signed more than $1.4 trillion in infrastructure commitments over the last few months with the goal of building out the data centers that it says are needed to meet soaring demand, but they are nowhere near having the money required to complete those deals. Friar gave the example of having to hold back Sora 2 for months due to compute constraints. >> I just want to be clear what it means when I say we're compute constrained.

It means that, for example, we cannot roll out our new models when they are ready. So from when Sora 2 was ready to when Sora 2 actually launched, there was probably a good six, seven month gap there. >> Okay. The agreements that they've signed have raised lots of questions around how a cash-burning company with tiny ...