
Anthropic's OpenAI Moment

The Gunslinger

There are, it seems, two ways to build a world-changing AI company. The first is the OpenAI way: you raise money at a blistering pace, you spend it at an even more blistering pace, and you bet that by building the biggest, most powerful models—for text, for images, for video, for robots—you will eventually achieve a level of technological supremacy so absolute that the profits will appear.

Then there is the Anthropic way. The story of Anthropic, the AI lab founded by ex-OpenAI researchers, has been one of disciplined caution. It has focused on a narrower set of products, primarily selling its Claude chatbot to corporate customers. It has avoided the costly moonshots into video generation and hardware. And, as a fascinating Wall Street Journal report revealed this week, it has charted a much clearer, and much faster, path to actually making money.

According to financial documents reviewed by the WSJ, the two companies are on wildly different trajectories. Here is the tale of the tape:

  • OpenAI projects its operating losses in 2028 will swell to $74 billion. It expects to burn through 14 times as much cash as Anthropic before finally turning a profit in 2030.
  • Anthropic, by contrast, expects to reach breakeven in 2028, two years ahead of its larger rival, by keeping its cost growth more closely in line with its revenue.

This is the central divergence in the AI arms race. OpenAI is the swashbuckling gunslinger, willing to burn a hundred billion dollars in pursuit of a multi-trillion-dollar dream. Anthropic is the cautious accountant, building a solid, profitable business one enterprise contract at a time. It's a classic tortoise-and-hare story.

Except, it seems, the tortoise just strapped a rocket to its back.

Just one day after the WSJ's story about its disciplined financial path, Anthropic announced that it plans to spend $50 billion to build its own custom AI data centers across the United States.

This is a monumental, and seemingly contradictory, move. For a company that has built its reputation on caution, this is capital expenditure on an almost OpenAI-like scale. It's a signal that the brutal physics of the AI business demand one thing above all else: a gargantuan, ever-increasing supply of computing power.

Until now, Anthropic has always been a "tenant," leasing its compute from giant cloud "landlords" like Amazon and Google. Amazon, its largest shareholder, has even built a massive data center complex in Indiana specifically for Anthropic's use. But the new $50 billion plan signals a fundamental shift. Anthropic is now moving to become its own landlord.

This is the great, inescapable paradox of the AI boom. You can be the most disciplined, cautious, enterprise-focused company in the world. But if you want to compete at the frontier of artificial intelligence, you cannot escape the need to spend tens of billions of dollars on infrastructure. It seems the laws of AI physics are immutable. In the end, everyone has to become a big-spending gunslinger.

More on AI Development

  • Reflection AI raises $2B to be America’s open frontier AI lab, challenging DeepSeek (TechCrunch)
  • Baidu dropped an open-source multimodal AI that it claims beats GPT-5 and Gemini (VentureBeat)
  • Is Google Maps the Secret Weapon in the AI War? (ARPU)

On Our Radar

A curated sample from our Intelligence Desk. We deliver these signals customized for specific markets and roles. Learn more about getting your custom intelligence.

Cisco's AI Payday

  • The Headline: Cisco Raises Forecast on Strong AI-Driven Demand, Securing Over $2 Billion in Orders (Reuters)
  • ARPU's Take: The great AI build-out isn't just about the chips; it's about the ultra-fast plumbing needed to connect tens of thousands of them. This is Cisco's sweet spot, and these results show the legacy networking giant is successfully capturing the massive 'second-order' revenue stream created by the GPU boom.
  • The Go-to-Market Implication: This signals a successful go-to-market pivot for Cisco, proving it can capture a significant share of the lucrative AI data center networking market, a segment traditionally dominated by its more specialized rival, Arista Networks. The $2 billion in AI orders, primarily from hyperscalers, validates Cisco's strategy to leverage its scale and supply chain prowess to compete directly on Arista's home turf. This transforms the competitive landscape from a campus vs. data center dichotomy to a head-to-head battle for high-performance AI Ethernet fabrics.

SoftBank's Great AI Re-Bet

  • The Headline: SoftBank Sells Entire $5.8 Billion Nvidia Stake to Finance OpenAI Investment (The New York Times)
  • ARPU's Take: This is a high-conviction trade. SoftBank is cashing out of the single most profitable public AI stock (Nvidia) to go all-in on a private one (OpenAI), making a definitive statement that they believe the real long-term value is in the model, not the silicon.
  • The Operations Implication: This signals SoftBank's bet that the ultimate value in the AI stack will accrue to the model/application layer (OpenAI) rather than the hardware layer (Nvidia). SoftBank is deliberately trading a liquid, publicly traded asset for a highly concentrated, illiquid position in a single private company, a high-risk move that fully aligns its future with OpenAI's success. For other tech investors, this forces a re-evaluation of where the long-term value in the AI ecosystem will be captured.

P.S. Tracking these kinds of complex, cross-functional signals is what we do. If you have a specific intelligence challenge that goes beyond the headlines, get in touch to design your custom intelligence.


You received this message because you are subscribed to ARPU newsletter. If a friend forwarded you this message, sign up here to get it in your inbox.