
AI's Unit Math Problem


$800 Billion Interest Payment

The unit math of AI is usually presented in terms of compute: teraflops, parameters, and tokens. But occasionally, someone does the math in terms of dollars, and the result is enough to make a CFO faint.

Last week, IBM CEO Arvind Krishna provided one of the most brutal back-of-the-envelope calculations we have seen. Speaking to The Verge, Krishna laid out the unit cost of the AI revolution: it takes about $80 billion to fill up a one-gigawatt data center.

Since the industry has collectively committed to building roughly 100 gigawatts of capacity, the total bill comes due at $8 trillion in capital expenditure.

Now, eight trillion dollars is a large number. It is roughly equivalent to the GDP of Japan and Germany combined. Interestingly, Krishna's estimate is actually on the high end. In a separate interview, Anthropic CEO Dario Amodei estimated the cost at closer to $50 billion per gigawatt.

But whether the final bill is $5 trillion or $8 trillion, the really scary number isn't the cost to build the data centers; it's the cost of the money used to build them. Here's Krishna:

Let's ground this in today's costs because anything in the future is speculative. It takes about $80 billion to fill up a one-gigawatt data center. That's today’s number. If one company is going to commit 20-30 gigawatts, that's $1.5 trillion of CapEx. To the point we just made, you've got to use it all in five years because at that point, you've got to throw it away and refill it. Then, if I look at the total commits in the world in this space, in chasing AGI, it seems to be like 100 gigawatts with these announcements. That’s $8 trillion of CapEx. It's my view that there's no way you're going to get a return on that because $8 trillion of CapEx means you need roughly $800 billion of profit just to pay for the interest.

To put that $800 billion interest payment in context: the combined net income of Apple, Microsoft, Alphabet, Meta, and Amazon last year was roughly $400 billion. The AI industry is building an infrastructure bill that is double the profit of the entire Big Tech oligopoly—and that's just to service the debt, not to pay back the principal or generate a return for shareholders.
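Krishna's back-of-the-envelope math can be reproduced in a few lines. The figures below come straight from the article; the 10% cost of capital is not stated directly but is implied by his $800 billion-of-interest-on-$8 trillion figure.

```python
# Krishna's unit math, as a sketch. All inputs are from the article;
# the cost of capital is implied by $800B interest on $8T of CapEx.
COST_PER_GW = 80e9           # dollars to fill one gigawatt data center
TOTAL_GW = 100               # rough industry-wide commitments
COST_OF_CAPITAL = 0.10       # implied rate: $800B / $8T

capex = COST_PER_GW * TOTAL_GW
annual_interest = capex * COST_OF_CAPITAL

# Combined net income of Apple, Microsoft, Alphabet, Meta, Amazon (per article)
BIG_TECH_NET_INCOME = 400e9

print(f"Total CapEx:     ${capex / 1e12:.1f} trillion")    # $8.0 trillion
print(f"Annual interest: ${annual_interest / 1e9:.0f} billion")  # $800 billion
print(f"Interest vs Big Tech profit: {annual_interest / BIG_TECH_NET_INCOME:.1f}x")
```

Even swapping in Amodei's lower $50 billion-per-gigawatt estimate only shrinks the interest bill to roughly $500 billion a year, still larger than Big Tech's combined profit.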

The Negative Margin Trap

But wait, it gets worse. You might think, "Well, they'll just grow their way out of it." But unlike traditional software, where adding a new user costs effectively zero (high fixed cost, zero marginal cost), AI has a variable marginal cost problem.

Every time a user asks a chatbot a question, it costs money (compute and electricity) to generate the answer. This is "inference cost," and here the math gets fuzzy, because OpenAI is a private company and its unit economics are a closely guarded secret. However, at least one detailed bottom-up analysis suggests the unit economics are currently upside down: it estimates that OpenAI spent roughly $8.7 billion on inference in the first nine months of 2025, while generating only $4.3 billion in revenue.

Other analyses, like one from Reuters, suggest OpenAI's gross margin is closer to 42%. But whether the margin is negative 100% or positive 42%, it is structurally lower than the 70-80% margins that SaaS investors are addicted to. In traditional software, scaling is cheap. In AI, scaling is expensive. This means that simply "growing your way out of it" is much harder, because every new dollar of revenue brings a significant new dollar of cost with it. The industry is betting that future efficiencies will eventually make the unit economics positive. But until then, every new user is technically a liability.
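The spread between the two estimates above is stark when expressed as gross margin. This sketch uses only the figures cited in the article; the comparison logic itself is illustrative, not anyone's disclosed accounting.

```python
# Comparing the two unit-economics estimates cited above.
def gross_margin(revenue, cost_of_revenue):
    """Gross margin as a fraction of revenue (negative when costs exceed it)."""
    return (revenue - cost_of_revenue) / revenue

# Bottom-up estimate: ~$8.7B inference spend vs ~$4.3B revenue (9 months of 2025)
bottom_up = gross_margin(4.3e9, 8.7e9)   # works out to roughly -100%

# Reuters-reported figure, for contrast
reuters = 0.42

# Typical floor of the 70-80% margins SaaS investors expect
saas_floor = 0.70

print(f"Bottom-up margin:  {bottom_up:.0%}")
print(f"Reuters margin:    {reuters:.0%}")
print(f"Gap to SaaS floor: {saas_floor - reuters:.0%}")
```

Either way the conclusion holds: even the optimistic 42% figure sits well below the margin structure that software valuations are built on.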

The Cone of Uncertainty

This brings us back to the revenue bet. The industry is betting that the explosive revenue growth will eventually fill this massive hole. But the people placing those bets seem less than certain.

Anthropic's Dario Amodei admitted in a New York Times interview that he is operating in a "cone of uncertainty." His company's revenue has grown 10x year-over-year for three years, but he conceded he doesn't "believe that" pace will continue. Yet he has to order billions of dollars of chips today for revenue that might exist in 2027, because data centers take two years to build.

This mismatch—between the certainty of the expense and the uncertainty of the revenue—has created a market defined by what Amodei calls "YOLOing." Companies that don't have $50 billion are borrowing it from their vendors (Nvidia, Microsoft) to buy chips from those same vendors, booking revenue for everyone involved while the actual cash flow remains theoretical.

We are watching a high-stakes game of chicken played with the world's largest balance sheets. The tech giants are betting that they can invent a new economy faster than the interest payments (and inference costs) can bankrupt the old one. The math suggests that for this to work, AI doesn't just need to be a good product; it needs to be more profitable than the entire current internet.

Or, as Krishna dryly noted about the trillions in spend: "Some people will make money, some people will lose money. All the infrastructure being built will be useful if it goes away." The concrete and the copper will survive the bankruptcy proceedings. The equity, however, might not.

More on AI Economy:

  • Jensen Huang: "AI is a five-layer cake. Energy, chips, infrastructure, models, and applications." (CSIS Interview)
  • The A.I. Boom Is Driving the Economy. What Happens if It Falters? (NYT)

On Our Radar

Our Intelligence Desk connects the dots across functions—from GTM to Operations—and delivers intelligence tailored for specific roles. Learn more about our bespoke streams.

Nvidia's China Re-entry

  • The Headline: The Trump administration has decided to allow Nvidia to sell its H200 AI chips to China, a strategic reversal driven by the assessment that Huawei's own chips are now so advanced that blocking Nvidia would only accelerate China's technological self-sufficiency. (Bloomberg)
  • ARPU's Take: This is a stunning pivot in US tech policy. The administration now believes Huawei's AI chips are so good that a total ban on Nvidia is counterproductive; it would simply force China to build on its own powerful, independent tech stack. The new strategy is to allow Nvidia to compete with a slightly older product to keep Chinese developers hooked on the American CUDA ecosystem.
  • The Operations Question: The operational impact is a massive inventory unlock for Nvidia. The H200, which is being superseded by Blackwell in the West, now has a huge secondary market in China. This allows Nvidia to extend the revenue lifecycle of the Hopper architecture, effectively dumping its "last-gen" inventory into a hungry market at high margins.

Oracle's OpenAI Hangover

  • The Headline: Investor sentiment has soured on Oracle due to concerns over its heavy financial reliance on a single, unprofitable customer (OpenAI) and the massive debt it is taking on to fund its AI data center buildout. (Reuters)
  • ARPU's Take: Oracle's massive OpenAI contract was a stunning coup that put it at the center of the AI boom. Now, the market is realizing that this blockbuster deal is also a single point of failure, creating immense financial and customer concentration risk that has erased all of its recent stock gains.
  • The Operations Question: This shift in investor sentiment signals a new phase of scrutiny for the AI infrastructure buildout, moving beyond headline-grabbing contract announcements to a focus on financial sustainability. For executives and boards overseeing large-scale AI investments, this serves as a critical case study on the dangers of customer concentration risk. It creates an imperative to balance the pursuit of large "anchor tenant" deals with the need for a diversified and financially resilient customer base.

P.S. Tracking these kinds of complex, cross-functional signals is what we do. If you have a specific intelligence challenge that goes beyond the headlines, get in touch to design your custom intelligence.


You received this message because you are subscribed to ARPU newsletter. If a friend forwarded you this message, sign up here to get it in your inbox.