The AI Money-Go-Round
The Circular Boom
Last month, we talked about cybersecurity's Kafkaesque problem: the only way to fend off AI-powered hackers is with AI-powered defenders, creating an absurd, self-feeding growth cycle for cybersecurity companies. It turns out the same circular logic is now propping up the financial foundations of the entire AI boom.
The industry has become a dizzying money-go-round, with the same handful of companies passing billions of dollars back and forth. The architecture of this self-funding boom is worth examining:
- The Nvidia-OpenAI Loop: Nvidia is investing up to $100 billion in OpenAI. A significant portion of that investment will, in turn, be used by OpenAI to buy massive quantities of Nvidia's own chips. It is the purest form of a circular investment: the supplier is funding the customer so the customer can buy more of the supplier's product.
- The AMD-OpenAI Value Capture: AMD is not investing cash in OpenAI. Instead, the chipmaker is giving OpenAI warrants to acquire up to 10% of its stock for a trivial price. In exchange, OpenAI becomes a massive, validating customer for AMD's next-generation AI chips. AMD is effectively subsidizing OpenAI's business with its own equity, betting that the partnership will create more market value than the shares it gives away (a rough sketch of that bet follows this list).
- The Cloud Provider Loop: This is where the web gets truly tangled. OpenAI has a $300 billion cloud contract with Oracle. To fulfill that contract, Oracle is spending billions of dollars buying chips from... Nvidia. A similar dynamic exists with the cloud provider CoreWeave, in which Nvidia is a shareholder. OpenAI has a $6.5 billion deal to use CoreWeave's infrastructure, which, of course, is built primarily with Nvidia's chips.
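To make the AMD bet concrete, here is a minimal back-of-the-envelope sketch. The share count, share price, and exercise price below are illustrative assumptions, not reported deal terms; only the "up to 10%" figure comes from the item above.

```python
# Illustrative math for the AMD warrant bet.
# All numeric inputs are rough assumptions for this sketch, not reported terms.

shares_outstanding = 1.6e9   # assumed AMD shares outstanding (~1.6 billion)
warrant_fraction = 0.10      # the "up to 10%" stake described above
share_price = 160.0          # assumed AMD share price, in dollars
exercise_price = 0.01        # the "trivial price" per warrant share (assumed)

warrant_shares = shares_outstanding * warrant_fraction
equity_given_away = warrant_shares * (share_price - exercise_price)

# The wager: the OpenAI partnership must add more market value than this.
print(f"Equity effectively handed to OpenAI: ~${equity_given_away / 1e9:.0f}B")
# At these assumed figures, that is roughly $26B of equity AMD is trading
# for the revenue and validation the partnership is expected to bring.
```

The point of the sketch is simply that the "subsidy" is not cash on the balance sheet but dilution, and the deal only pays off if the market value created exceeds the value of the shares surrendered.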
It is a beautiful, self-perpetuating system. The deals are fueling a historic technological build-out, but they are also raising an uncomfortable question with echoes of a less fondly remembered era. As Bloomberg put it:
Never before has so much money been spent so rapidly on a technology that, for all its potential, remains largely unproven as an avenue for profit-making. And often, these investments can be traced back to two leading firms: Nvidia and OpenAI. The recent wave of deals and partnerships involving the two are escalating concerns that an increasingly complex and interconnected web of business transactions is artificially propping up the trillion-dollar AI boom.
This is where the ghost of the dot-com bust looms large. But as Jeff Bezos argued during a recent tech conference, not all bubbles are created equal. There is a crucial distinction between a purely financial bubble, like the 2008 banking crisis, and an "industrial bubble" built around a transformative technology. Here is Bezos on this point:
The great thing about industrial bubbles... is that when the dust settles and you see who the winners are... society benefits from those inventions... If we go back 25 years ago when the internet was in that bubblish moment... all of that fiber optic cable that got laid, and by the way the companies who laid all that cable went out of business. Literally went bankrupt. But the fiber optic cable was still there and we got to use it.
This is perhaps the most useful way to think about the current AI frenzy. The financial valuations may be irrational. The circular deals may be a sign of unsustainable hype. Many of the companies involved may, like their dot-com predecessors, ultimately fail. But the infrastructure they are building in the process—the data centers, the massive AI factories—will remain. We will, as Bezos said, get to use it. The only problem, of course, is that after the dot-com bust, the world was left with a glut of dark fiber that took nearly a decade to absorb. The legacy of an industrial bubble isn't just the useful infrastructure; it's also the vast and often painful overcapacity that follows.
The Analyst's Memo
Key signals from our members-only intelligence feed that are shaping our thinking this week.
Oracle: The Information, citing internal documents it has seen, reported that Oracle's high-growth AI cloud business is operating on an average gross margin of just 16 percent.
- 📉 The AI Margin Squeeze (ARPU's Take): The report on Oracle's thin AI margins initially sent a tremor through the market, validating the brutal economics of the infrastructure race. However, the stock's rapid recovery reveals the market's current narrative: it is overwhelmingly prioritizing strategic market share capture over near-term profitability. This sets up the critical long-term question for Oracle: are these low margins a temporary cost in a land grab, or a permanent structural feature of the hyper-competitive AI infrastructure market?
Intel: The company has revealed details of its "Panther Lake" laptop processor, the first chip to be built using its next-generation 18A manufacturing process.
- 🏭 The Foundry's First Test (ARPU's Take): Intel is staking its turnaround on the Panther Lake chip, using it as the critical first demonstration of its advanced 18A manufacturing process. This isn't just about a faster chip; it's a public showcase meant to prove to investors and potential foundry customers that Intel's costly and ambitious manufacturing roadmap is finally delivering, aiming to reclaim its technological leadership.
These are just a few of the signals we track and analyze inside the ARPU Intelligence Desk. Members get the complete picture with our full Signal Stream, Briefing Library, and monthly Competitive Radar.