
Where is the AI Money Actually Going?

Programming note: The next issue lands on 13 February, when we'll examine the financing of the AI economy.

120% Capture Rate

Last time, we looked at the napkin math of OpenAI's intelligence treadmill, where every dollar of revenue seems to be immediately consumed by the cost of the machines wearing out. But money doesn't just disappear into the ether; it has to land somewhere. Examine the supply chain for the current AI buildout, and it looks less like a software revolution and more like a massive redistribution of wealth from tech investors to the people who sell silicon and concrete.

Now, if you write a multi-billion-dollar check for a gigawatt of AI capacity, who actually walks away with the cash? According to Bridgewater's analysis of the supply chain, the spending breakdown is remarkably lopsided (a quick napkin-math tally follows the list):

  • Chips: ~56%. More than half of every dollar spent on AI infrastructure goes directly to chipmakers like Nvidia, AMD, or Broadcom (though to be clear, Nvidia is the only one capturing a true lion's share here—generating 8–10x the relevant segment revenue of its closest competitors).
  • Networking & Server Racks: ~12%. The switches, cables, and physical frames that hold the whole thing together and allow the chips to talk to one another.
  • Power & Cooling: ~18%. This is the cost of keeping the lights on and the chips from melting. (It turns out that chasing superintelligence generates an ungodly amount of heat.)
  • Construction: ~15%. The physical shell, the steel, and the labor required to put it all together.
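
For readers who want the napkin math spelled out, here is a minimal sketch of how a single capex check would divide across those buckets, using the approximate shares above. The $50 billion total is purely illustrative, not a figure from Bridgewater or the article.

```python
# Napkin math: splitting a hypothetical AI capex check across the
# approximate buckets above. The $50B total is illustrative only.
capex_total = 50e9  # hypothetical data-center program, in dollars

breakdown = {
    "Chips": 0.56,
    "Networking & server racks": 0.12,
    "Power & cooling": 0.18,
    "Construction": 0.15,
}

for bucket, share in breakdown.items():
    print(f"{bucket:<27} {share:>4.0%}  ${share * capex_total / 1e9:>5.1f}B")

# The rounded shares sum to roughly 101%, a reminder that these are
# directional estimates rather than a precise budget.
print(f"{'Total (rounded shares)':<27} {sum(breakdown.values()):>4.0%}")
```

Swap in your own capex total; the point is how little of the check goes to anything that looks like software.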

The more you stare at that breakdown, the more it reads like a list of constraints rather than a list of costs. Chips are scarce. Power is scarce. Land is scarce. And in a way that no one budgeted for, skilled labor is scarce too. If AI software is eating the world, it's doing it with a fork and knife made of copper wire and HVAC. The bottleneck is no longer compute in the abstract—it's compute that can actually be installed.

Picking Up a Wrench

The construction slice is particularly interesting because it explains the recent career advice coming from the top of the pyramid. Nvidia CEO Jensen Huang has been making the rounds telling Gen Z to put down the Python manual and pick up a wrench. He predicts a boom in "six-figure opportunities" for plumbers, electricians, and steelworkers.

He isn't just being folksy. He is looking at his own customers' spreadsheets. If the world is truly on its way to a $7 trillion infrastructure build-out, that 15% slice represents a trillion-dollar payday for the trades. Huang knows that his chips are useless if there isn't a specialized electrician available to wire the data center. He is effectively begging for more plumbers to clear the bottleneck so he can sell more silicon. It is a moment of a CEO being honest about his own supply chain: he has built the world's smartest brain, but he is currently waiting on a guy with a soldering iron to turn it on.

The $1.20 Mystery

There is, however, a piece of math in this supply chain that feels like a glitch in the Matrix. In an interview with Goldman Sachs, David Cahn of Sequoia Capital observed that for every $1 of revenue that comes in at "the top"—meaning the actual money earned by AI labs like OpenAI or Anthropic from their applications—Nvidia manages to keep about $1.20.

In a normal world, it is difficult to capture 120% of a market's total revenue. But the AI economy is not your typical market; it is a capital project.

The gap is being footed by the middle of the sandwich. Hyperscalers like Microsoft and Meta are not buying chips solely with the money they made from selling AI; they are buying them with the money they made from selling Windows licenses and Instagram ads. They are subsidizing the difference today in the hope that the software catches up tomorrow. (Or, as Cahn put it, someone has to be propping up the supply chain, and it certainly isn't the customer.)
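
To make the glitch concrete, here is a minimal sketch of the flow Cahn describes. The only number taken from the interview is the roughly $1.20 of Nvidia revenue per $1.00 of AI-lab application revenue; everything else is an illustrative placeholder.

```python
# A minimal sketch of Cahn's "120% capture" observation, per dollar of
# AI application revenue. Only the ~1.2x ratio comes from the interview.
lab_revenue = 1.00            # $ earned at "the top" by labs like OpenAI/Anthropic
nvidia_per_lab_dollar = 1.20  # Nvidia's take per $ of lab revenue (Cahn's estimate)

nvidia_revenue = lab_revenue * nvidia_per_lab_dollar
gap = nvidia_revenue - lab_revenue  # the part the end customer is not paying for

print(f"AI-lab revenue:             ${lab_revenue:.2f}")
print(f"Nvidia's take:              ${nvidia_revenue:.2f}")
print(f"Gap footed by hyperscalers: ${gap:.2f}")
# That $0.20 (plus the rest of the stack) is bridged today by cash from
# Windows licenses and Instagram ads, in the hope the software catches up.
```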

The FOMO Mandate

If you are an investor looking at these lopsided economics, you might ask why the hyperscalers are so eager to keep writing checks to Nvidia. The answer, according to both Goldman and Bridgewater, is a mix of high-stakes game theory and the AGI jackpot.

Bridgewater describes this as a "competitive resource grab." Because the components (chips, power, and land) are scarce, the leading players are locked in a race to secure them years in advance. If you don't commit to your $50 billion data center today, your competitor will, and you'll find yourself in 2027 with a great algorithm but no factory to run it in.

Goldman's Joseph Briggs takes an even broader view: he estimates that generative AI will eventually create $20 trillion in economic value. When you are looking at a $20 trillion prize, spending $1 trillion on a resource grab starts to look less like a gamble and more like an insurance premium. As Briggs notes, the revenue potential from productivity gains generally exceeds the current investment forecasts even before you factor in the emergence of AGI.

In other words, the math of asymmetrical regret has taken over. This logic was best explained by Alphabet CEO Sundar Pichai back in 2024:

The one way I think about it is when you go through a curve like this, the risk of underinvesting is dramatically greater than the risk of overinvesting for us here. Even in scenarios where if it turns out we are overinvesting, these are infrastructure which are widely useful for us, they have long useful lives, and we can apply it across and we can work through that.
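
For the curious, a rough expected-regret sketch makes that asymmetry explicit. The $20 trillion prize and roughly $1 trillion build-out come from the figures above; the probability, market share, and salvage fraction are illustrative assumptions, not sourced numbers.

```python
# Illustrative expected-regret math behind "underinvesting is riskier than
# overinvesting". Probabilities and shares are assumptions, not sourced.
PRIZE = 20e12     # Goldman's estimate of generative AI's economic value
BUILDOUT = 1e12   # rough cost of the infrastructure resource grab

p_payoff = 0.5    # assumed odds the software value materializes
my_share = 0.10   # assumed slice of the prize one hyperscaler could win
salvage = 0.5     # assumed fraction of "wasted" capex that stays useful
                  # (Pichai: the infrastructure has "long useful lives")

# Overinvest and AI underdelivers: you eat the un-salvageable capex.
regret_overinvest = (1 - p_payoff) * BUILDOUT * (1 - salvage)

# Underinvest and AI delivers: you forfeit your slice of the prize.
regret_underinvest = p_payoff * PRIZE * my_share

print(f"Expected regret of overinvesting:  ${regret_overinvest / 1e12:.2f}T")
print(f"Expected regret of underinvesting: ${regret_underinvest / 1e12:.2f}T")
# Even with cautious assumptions, missing the prize costs several times
# more than stranding the capex -- which is the asymmetry being described.
```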

And so, here we are: in the middle of the largest industrial bet in history. We have the silicon, we have the steel, and we are frantically looking for the electricians. Whether the $20 trillion in software value ever actually materializes is a question for the next decade.[1] For now, the only certain winners are the ones selling the shovels and the ones holding the wrenches.

It is a very expensive way to build a brain, but at least the plumbers are getting paid.

Note [1]: Google DeepMind CEO Demis Hassabis recently estimated that AGI is still five to ten years away. He argues that the industry needs "one or two more big breakthroughs" in areas like long-term reasoning, planning, and efficient memory before the technology can truly fulfill its promise.

On Our Radar

Our Intelligence Desk connects the dots across functions—from GTM to Operations—and delivers intelligence tailored for specific roles. Learn more about our bespoke streams.

Anthropic's New AI Tools Trigger Selloff in Data and Software Stocks

  • The Story: A major selloff hit data analytics and software stocks like Thomson Reuters after AI startup Anthropic launched new "Claude Cowork" plugins that automate complex tasks in legal and data analysis, sparking investor fears of industry-wide disruption. (Reuters)
  • The Investment Implication: The market is aggressively pricing in a direct disruption threat to established data and professional services companies. For investors, Anthropic's new tools represent a tangible example of AI moving from a complementary technology to a direct substitute for high-margin enterprise services, which is causing a fundamental and negative re-rating of the entire sector's long-term valuation.

Nvidia's China Chip Sales Stalled by US Inter-Agency Security Review

  • The Story: Nvidia's plans to resume selling H200 AI chips to China are stalled as the U.S. government conducts a lengthy inter-agency national security review, with the State Department reportedly pushing for tougher restrictions before issuing export licenses. (Financial Times)
  • The Operations Implication: The inter-agency review has become a critical operational bottleneck, halting Nvidia's go-to-market plan for China. This transforms the sales process from a standard commercial transaction into a complex, unpredictable geopolitical negotiation, leaving Nvidia's supply chain and revenue forecast for a key market in a state of high uncertainty.

P.S. Tracking these kinds of complex, cross-functional signals is what we do. If you have a specific intelligence challenge that goes beyond the headlines, get in touch to design your custom intelligence.

