The real story isn't whether AI will change the world—it's whether the world can afford to build it.
The $1.4 Trillion Question
OpenAI has signed more than $1.4 trillion in infrastructure commitments over recent months, with the goal of building out data centers needed to meet soaring demand. But the company is nowhere near having the money required to complete those deals. Its losses are staggering: Microsoft revealed that OpenAI lost roughly $11.5 billion in a single quarter—its worst on record. That pushes year-to-date losses north of $25 billion against projected annual revenue of about $20 billion.
The company has raised nearly $58 billion in equity so far and was valued at $500 billion last month. It's now talking about an IPO at a $1 trillion valuation next year, which could bring in about $60 billion in cash. But that's just over 4% of its $1.4 trillion in infrastructure commitments.
To bridge the gap, OpenAI has leaned on increasingly creative deal structures. Nvidia pledged up to $100 billion in reciprocal investments, while AMD granted OpenAI warrants to buy roughly 10% of its stock for a penny per share if deployment milestones are met. Even if every one of these deals pays out, OpenAI would bring in roughly $200 billion, still leaving it about $1.2 trillion short.
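The arithmetic above can be checked with a quick back-of-the-envelope calculation. All figures are the rough approximations quoted in this piece, in billions of dollars, not audited numbers:

```python
# Back-of-the-envelope check of the financing gap described above.
# All figures are the rough approximations quoted in this piece ($B).
commitments = 1_400   # infrastructure commitments
ipo_proceeds = 60     # hypothetical cash from an IPO at a $1T valuation
deal_inflows = 200    # rough upper bound on the Nvidia/AMD-style deals

ipo_share = ipo_proceeds / commitments
print(f"An IPO would cover about {ipo_share:.1%} of commitments")
# -> An IPO would cover about 4.3% of commitments

shortfall = commitments - deal_inflows
print(f"Even if every deal pays out: ${shortfall}B still unfunded")
# -> Even if every deal pays out: $1200B still unfunded
```

However the deals are structured, the order of magnitude of the gap does not change.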
The Backstop Gambit
This was not the first time OpenAI has looked to Washington for help. Just a month ago, the company sent a detailed letter to the White House urging the federal government to double down on semiconductor subsidies, asking that tax credits be expanded to cover the entire AI supply chain from chip fabrication to data centers and grid hardware.
Sarah Friar, OpenAI's CFO, explained at a Wall Street Journal event that she is looking for an "ecosystem of banks, private equity, maybe even governmental" sources of financing, meaning a federal subsidy or backstop that guarantees the loans. Such a guarantee would lower the cost of financing and raise the loan-to-value ratio: the amount of debt that can be layered on top of an equity portion.
The problem is straightforward: each gigawatt of compute costs around $50 billion, roughly $15 billion for land and infrastructure and $35 billion for GPUs. Lenders know how to finance data center buildings, which have 20-, 25-, even 30-year lives. Chips are another matter: their useful life at the frontier is uncertain, and the faster chip innovation moves, the faster they can be expected to depreciate. That makes the $35 billion of chips in a $50 billion data center very hard to finance. Investors don't want to own hardware that may collapse in value when the next generation ships, and they certainly don't want to accept it as collateral on a loan.
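A toy depreciation model makes the collateral problem concrete. The $15 billion/$35 billion split per gigawatt comes from the figures above; the straight-line schedules and the four-year frontier-chip life are illustrative assumptions, not industry data:

```python
# Toy model: book value of a $50B/GW data center over time.
# The cost split comes from the figures above; the depreciation lives
# (25 years for the building, 4 years for GPUs) are assumptions.
BUILDING = 15.0  # $B: land, shell, power, cooling
GPUS = 35.0      # $B: accelerators

def straight_line(value, life_years, after_years):
    """Remaining book value under straight-line depreciation."""
    return max(0.0, value * (1 - after_years / life_years))

for year in range(1, 5):
    b = straight_line(BUILDING, 25, year)
    g = straight_line(GPUS, 4, year)
    print(f"year {year}: collateral ${b + g:.2f}B of $50.00B "
          f"(GPUs down to ${g:.2f}B)")
```

Under these assumptions the GPUs contribute nothing to the collateral base after four years, which is why lenders discount them so heavily.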
The result: banks don't want to lend, and the interest rate on a loan backed by rapidly depreciating chips would be so high that only a government guarantee could bring it down.
For her part, OpenAI's CFO walked back the suggestion in a LinkedIn post later that day, saying she had meant that the government should play its part alongside the private sector in supporting America's AI growth, and that OpenAI was not seeking a government backstop for its infrastructure commitments. Sam Altman likewise posted that OpenAI neither has nor wants government guarantees for its data centers.
The Economics Don't Add Up
The unit economics of running the current generation of large language models are dire. The incentive for every player seems to be to grow the top line as fast as possible, even when adding users only deepens the losses. The models have negative unit economics, which is a fancy way of saying the company loses money on every sale and tries to make it up on volume.
In AI, costs rise almost linearly with usage, which is very different from traditional software: there is no marginal-cost magic. Despite an invitation-only rollout, OpenAI may be losing around $15 million a day, or roughly $5 billion annualized, on Sora 2, its AI video-generation app.
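The linear-cost point can be sketched numerically. The roughly $15 million/day Sora estimate is the figure quoted above; the per-user revenue and cost numbers in the toy loss function are invented for illustration:

```python
# Annualizing the reported Sora 2 burn estimate (~$15M/day).
daily_burn = 15e6
print(f"Annualized: ${daily_burn * 365 / 1e9:.1f}B")
# -> Annualized: $5.5B

def annual_loss(users, revenue_per_user, cost_per_user, fixed_costs):
    """Negative unit economics: each added user widens the loss."""
    return users * (revenue_per_user - cost_per_user) - fixed_costs

# Invented numbers: $20/user/year revenue vs $28/user/year inference cost.
for users_m in (1, 10, 100):
    loss = annual_loss(users_m * 1e6, 20, 28, 1e9)
    print(f"{users_m:>3}M users -> ${loss / 1e9:.2f}B")
```

When cost per user exceeds revenue per user, growth scales the loss rather than amortizing it, which is the opposite of the traditional software playbook.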
Meanwhile, Nvidia posted a 62% jump in revenue for the three months to October, far ahead of expectations. Data center sales hit $51.2 billion, and the company raised its revenue forecast for the current quarter to $65 billion. For now, the numbers seem to justify the hype, but the worry is that this revenue, and its growth rate, are ultimately unsustainable.
The question hanging over Silicon Valley is not so much whether AI will change the world, but whether the world can afford to build it.
Bottom Line
The core problem remains: OpenAI has committed to $1.4 trillion in infrastructure with no clear path to pay for it. Creative financing deals with Nvidia and AMD bring in hundreds of billions—but leave a trillion-dollar gap that requires government backing to make the economics work. The company is essentially asking taxpayers to guarantee loans against rapidly depreciating AI chips, while burning tens of billions per year with no end in sight. Whether AI will transform the world remains uncertain; what seems increasingly clear is that building it requires money that doesn't exist.