
Oracle is an OpenAI Credit Story


Programming note: We've just released our thematic report: The Inference Economy: Where AI Margins Actually Come From in 2026. It's a deep-dive into the cost-per-token collapse, who's actually capturing margin across the AI stack, and why inference efficiency is emerging as the industry's decisive second competitive axis. Access a free copy here. We return next Friday with some thoughts on China's AI development.

The Layoff That Built a Data Center

On March 31, Oracle told up to 30,000 employees they no longer had jobs. A week later, it was finalizing the largest debt syndication in data center history.

The timing is not a coincidence. It is arithmetic.

Normally, when a tech giant cuts that much of its workforce, it is a signal that demand for its product is collapsing. At Oracle, it is because demand is too expensive. The company's trailing twelve-month free cash flow has plunged to negative $24.7 billion. Wall Street estimates that the mass layoffs will free up about $8 to $10 billion in incremental cash. The layoffs are not a sign that the AI boom is passing Oracle by; they are a financing mechanism for participating in it. In the new AI economy, you fire your humans so you can afford to pay the interest on your servers.

That is the Oracle story in miniature. Strip away the cloud valuation multiples, the infrastructure narrative, and the $553 billion backlog, and what you have is a highly leveraged credit bet on a single counterparty that has never turned a profit. Oracle is no longer an enterprise software company with a debt problem. It is primarily an OpenAI credit story wearing a cloud multiple. The rest is just concrete, Nvidia chips, and a lot of very nervous bondholders.

From Database to Datacenter

Eighteen months ago, Oracle was a legacy database company with a credible but unremarkable cloud business. Today it is a leveraged AI infrastructure bet trading in public markets. On September 10, 2025, the stock surged 36% in a single day, briefly making Larry Ellison the world's richest person and pushing Oracle's market cap past a trillion dollars. The stock now sits 45% below that peak, as investors have worked through the less glamorous details.

The transformation rests on a technical bet that is worth understanding, because it explains why Oracle got the OpenAI contract rather than AWS. Oracle built OCI's cluster networking around RDMA over Converged Ethernet (RoCE), which reduces latency between GPUs during distributed AI training. The architecture turned out to matter: OCI grew 84% last quarter, faster than any major IaaS provider.

The architecture is real. The growth is real. But it is worth asking what made the Oracle slice of OpenAI's compute strategy so different. OpenAI is, of course, a multi-cloud customer; it still signed a $38 billion multi-year deal with Amazon last November. The answer is partly technical and partly financial. Oracle was willing to sign a contract at a scale and on terms that no investment-grade company with a functioning treasury department would normally sanction. The technical differentiation got Oracle into the room. The willingness to torch its own balance sheet is what closed the deal.

The Arithmetic

Two years ago, Oracle's capital expenditure as a percentage of revenue was 20%; today it is 108%. TTM free cash flow fell from positive $11.8 billion to negative $24.7 billion over the same span. The company is planning to issue equity for the first time in fifteen years. A new CFO was appointed specifically to manage the cash crunch.

Credit markets have formed a view. Last month, Bloomberg reported that Oracle's five-year credit default swaps hit 198 basis points, their widest level since 2008. S&P and Moody's both have negative outlooks. Oracle now sits two notches above junk and has become, per credit market participants interviewed by Bloomberg, the most liquid proxy for AI infrastructure credit risk in the investment-grade universe.

That is not a compliment you would necessarily frame and hang on the wall.

Oracle's debt syndication took months to place, involved more than two dozen banks, and needed lenders to reach into Asia and insurance company balance sheets to fill the book. The Michigan campus required Pimco to step in as anchor after Bank of America spent months trying to assemble the financing. Banks ended up holding larger portions than anticipated.

The layoffs, in this context, are not a surprise. They are arithmetic. The company needs cash. Salaries are cash.

The Actual Bet

When you strip the AI narrative from Oracle's investment case, what remains is this: you do not own a diversified infrastructure story. You own a very elaborate opinion about one counterparty's future solvency.

Here is what that looks like in practice.

The $300 billion OpenAI compute contract runs for five years beginning in 2027. Assuming an even distribution, that is $60 billion per year—roughly equal to Oracle's entire corporate revenue last fiscal year. There is just one problem: OpenAI's current annualized revenue is estimated at around $25 billion, and its projected 2026 cash burn is roughly $17 billion. As D.A. Davidson's Gil Luria bluntly put it: "OpenAI is in no position to make any of these commitments."

For OpenAI to comfortably honor $60 billion annually to Oracle alone—before it pays its staff, trains its models, or cuts checks to AWS and Microsoft—it needs to be a company that does not yet exist, generating revenue it has not yet earned, from a business model it has not yet fully proven.
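The gap can be sketched in a few lines, using only the figures above. The even five-year split is an assumption; the actual payment schedule has not been disclosed.

```python
# Back-of-the-envelope gap between OpenAI's reported finances and the
# Oracle contract's implied run-rate. All figures (in $B) come from the
# article; the even five-year split is an assumption.
contract_total = 300   # Oracle-OpenAI compute contract, $B
contract_years = 5     # runs five years beginning 2027

annual_commitment = contract_total / contract_years   # $60B per year

openai_revenue = 25    # estimated annualized revenue, $B
openai_burn = 17       # projected 2026 cash burn, $B

# Even before the Oracle bill arrives, OpenAI spends more than it earns.
coverage = openai_revenue / annual_commitment
print(f"Annual Oracle commitment: ${annual_commitment:.0f}B")
print(f"Current revenue covers {coverage:.0%} of it, "
      f"while OpenAI burns ${openai_burn}B a year")
```

On these numbers, OpenAI's entire top line covers well under half of the Oracle commitment alone.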

So how does a company with $25 billion in annualized revenue sign a $300 billion contract? Through a financial mechanism that critics compare to the vendor financing of the late-1990s telecom bubble. The money effectively moves in a giant, leaky triangle:

  1. Nvidia commits $30 billion of equity funding to OpenAI.
  2. OpenAI uses that financial backing to help underwrite a $300 billion future compute contract with Oracle.
  3. Oracle takes on massive debt to buy roughly $40 billion worth of GPUs from Nvidia for its Abilene data center.

If you are checking the math, you will notice a missing zero. Nvidia's $30 billion check covers exactly one-tenth of OpenAI's promise to Oracle.

Nvidia CEO Jensen Huang recently noted that this $30 billion "might be the last" check Nvidia writes before OpenAI goes public. Which means Oracle isn't just vendor-financing a startup; it is vendor-financing an eventual OpenAI IPO. Oracle is torching its own free cash flow and stressing its credit rating to buy Nvidia chips, on the assumption that SoftBank, Silicon Valley, and eventually the public stock market will step in to make its primary customer solvent enough to pay the rent.

And if you look closely at Oracle's financial filings, you can see the sheer gravity of this arrangement. Oracle will point out that its $553 billion Remaining Performance Obligation (RPO) is a massive cushion. But the RPO is not a pile of cash sitting in a vault. Only 12% of it is expected to be recognized as revenue within the next twelve months; the vast bulk lands in years three through five.

Oracle has noted that some of this backlog requires no incremental cash outlay because customers are prepaying for GPUs. That is a genuine mitigant. But fundamentally, the $553 billion number is a highly illiquid asset. It is a long-dated promise from an entity that is itself running on ambition and external financing.
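A quick split of the backlog using the article's 12% figure shows how little of the headline number is near-term (the year-by-year breakdown of the remainder is not disclosed, so only the two buckets are computed):

```python
# Near-term vs. long-dated slices of Oracle's RPO, per the article's
# figures. The 12% near-term share is the only breakdown disclosed.
rpo_total = 553          # total remaining performance obligation, $B
near_term_share = 0.12   # recognizable within the next twelve months

near_term = rpo_total * near_term_share   # ~$66B
long_dated = rpo_total - near_term        # ~$487B

print(f"Recognizable within a year: ${near_term:.0f}B")
print(f"Long-dated promise: ${long_dated:.0f}B")
```

Roughly $66 billion is near-term; the other ~$487 billion is a promise about years that have not happened yet.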

The Zoning Board of Superintelligence

There is a second way Oracle's bet can fail, and it has nothing to do with OpenAI's cash balance. It has to do with the physical world.

In enterprise software, a $553 billion backlog is usually treated as money in the bank. But a cloud backlog is not a software license; it is a construction pipeline. And the physics of that pipeline are currently breaking down.

Bloomberg reported last December that Oracle is pushing the completion dates for some of its OpenAI data centers from 2027 to 2028. Oracle quickly denied the report, in the particular way large companies deny things: "There have been no delays to any sites required to meet our contractual commitments, and all milestones remain on track."

But whether Oracle wants to admit it or not, the broader industry math is unforgiving. Of the 12 gigawatts of data centers slated to open in the US this year, only a third are actually under construction, according to Sightline Climate. Data Center Watch, which tracks local opposition, counts another $64 billion in projects blocked or delayed by community pushback as of early 2025.

It turns out that building AI infrastructure requires two things that tech companies are not used to waiting for: community approval and electrical switchgear. You can raise billions of dollars from Wall Street in a matter of weeks, but you still have to wait for transformers to ship from China, and you still have to convince the local town council in Virginia that your 100-foot-tall cooling fans won't keep them awake at night.

In a normal cloud business, a one-year construction delay is annoying. In Oracle’s highly leveraged credit loop, a delay is mathematically violent.

If you borrow tens of billions of dollars to build an Abilene data center, the interest clock starts ticking immediately. If completion slips from 2027 to 2028, that is 12 months where Oracle is paying bondholders but cannot recognize revenue from OpenAI. The longer the physical buildout takes, the deeper the cash hole gets. You cannot pay your interest expense with a delayed circuit breaker, and you cannot serve AI tokens out of an empty field that is still waiting on a zoning permit.
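A rough illustration of that carrying cost, with the caveat that both the principal and the rate below are hypothetical placeholders, not Oracle's actual borrowing terms:

```python
# Rough carrying cost of a one-year construction slip on debt-financed
# capacity. Principal and rate are illustrative assumptions only,
# not Oracle's actual debt terms.
borrowed = 38e9      # hypothetical debt tied to one campus, $
rate = 0.06          # hypothetical blended interest rate
delay_years = 1      # completion slips from 2027 to 2028

# Interest accrues during the delay with zero offsetting revenue.
carry = borrowed * rate * delay_years
print(f"Interest paid while the site earns nothing: ${carry/1e9:.1f}B")
```

Even at these placeholder numbers, a single year of slippage costs billions in interest with nothing coming in against it.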

The Verdict That Has Not Arrived Yet

The credit markets and the equity markets are currently telling different stories, and they cannot both be right. Credit markets see meaningful stress probability. Equity analysts carry a consensus price target roughly 40% above current levels.

Equity investors are pricing Oracle as a cloud infrastructure company with a large backlog and a temporarily stressed balance sheet. Credit is pricing it as a heavily leveraged lender to a single borrower with a $207 billion funding shortfall and no profits. Both descriptions are factually accurate. They just assign very different probabilities to the same outcome.

A backlog is only as good as the entity that signed it, and the zoning board that permits it. The customer that signed the largest portion of it is running on ambition and external financing, and the concrete it requires is stuck in municipal hearings and global supply chain bottlenecks. That is either the defining commercial relationship of the AI era or the most concentrated counterparty risk in the history of enterprise software.

Oracle's bet is breathtakingly aggressive. If OpenAI succeeds, Oracle officially becomes the fourth hyperscaler. If OpenAI stumbles, Oracle becomes the world's most over-leveraged landlord, stuck with billions in specialized real estate. The asymmetry is brutal. But at least the layoffs paid for another few thousand GPUs.

Related reading:

  • OpenAI Stargate: where the US sites stand (Epoch AI)
  • The Inference Economy: Where AI Margins Actually Come From in 2026 (ARPU)

📊 Data > Narrative

We pull key data points to show you the mathematical reality of what's happening in tech.

The Data: Nearly 40% of US data center projects due in 2026 are at risk of delays exceeding three months, per SynMax satellite analysis reported by the FT. At Oracle's own 1.4GW Shackelford County campus — the site being built for OpenAI — only one of the six planned facilities showed active development.

The Takeaway: Oracle's $553 billion backlog requires two things to go right simultaneously: OpenAI has to grow into the contract, and Oracle has to deliver the capacity on time. The construction data suggests the second assumption is now at least as uncertain as the first.

