Construction Firms with Software Valuations
Macrohard is Very Hard
The ultimate goal for many Silicon Valley startups is to eventually become a software company. Software is a great business model because its second copy costs the same to produce as its billionth: essentially nothing. It escapes the physical economics of the factory floor, promising high margins, low capital needs, and a product made of logic, not atoms.
For the last year, Elon Musk has been signaling this transition for xAI by using a specific term: "Macrohard." It is a characteristic play on Microsoft, intended to describe an AI-only software powerhouse that will eventually provide the "brains" for the Optimus humanoid robot.
The problem, as a series of internal documents reviewed by Bloomberg this week reveals, is that "Macrohard" is currently looking very, very hard.
xAI reported a net loss of $1.46 billion for the September quarter, up from $1 billion earlier in the year. In the first nine months of 2025, the company burned through $7.8 billion in cash. While revenue did double quarter-over-quarter to $107 million, it is still tracking well behind the $500 million annual goal set for investors last June.
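The gap between that run rate and the investor goal is simple arithmetic. A quick sketch, using the reported figures above and the simplifying assumption that the September quarter's pace holds for a full year:

```python
# Annualize xAI's September-quarter revenue against its investor goal.
# Straight-line annualization (quarter x 4) is a deliberate simplification;
# it actually flatters the company by crediting every quarter at its best pace.
quarterly_revenue_m = 107      # Sept-quarter revenue, $ millions (reported)
annual_goal_m = 500            # annual revenue goal set for investors last June

run_rate_m = quarterly_revenue_m * 4       # naive annualized run rate
shortfall_m = annual_goal_m - run_rate_m

print(f"run rate: ${run_rate_m}M, shortfall vs goal: ${shortfall_m}M")
# prints: run rate: $428M, shortfall vs goal: $72M
```

Even granting the best quarter's pace to the whole year, the run rate lands around $428 million, short of the $500 million target before counting the slower quarters that came before.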
Usually, when a software company loses a billion dollars in three months, it is because it hired way too many expensive engineers or spent too much on a Super Bowl ad. But xAI's burn isn't just about talent; it is about construction. The company is currently busy turning a building in Memphis into a 2-gigawatt data center called "Colossus" and buying billions of dollars' worth of Nvidia chips and Tesla Megapacks.
There is an institutional irony here. To build a company that is supposed to be "software-only," Musk has to act like a 19th-century industrialist. You cannot build a Macrohard software "brain" without first becoming a major player in the Tennessee real estate and global electricity markets. Musk wants the valuation of a company that sells "computer logic," while running a business that sells megawatts.
The $1.4 Trillion Utility
xAI is not alone in this software-to-hardware identity crisis. OpenAI is currently running the same playbook, but with an extra zero attached to the figures.
Reports in November indicated that OpenAI expects to rack up operating losses of $74 billion in 2028 alone. They are currently spending about $1.69 for every $1 of revenue they bring in. Like xAI, OpenAI argues that this is merely a "compute deficit" problem—that the demand for AI is insatiable, and the only thing holding back profitability is the lack of physical machines to run the code.
This has led Sam Altman to sign up for $1.4 trillion in infrastructure commitments over the next eight years. If you are an investor, you have to decide which mental model you are using to value these companies:
- The Software Model: You are buying a high-margin "brain" that will eventually scale for free.
- The Utility Model: You are funding a massive, capital-intensive infrastructure project—basically a global power grid for tokens—that will have the thin margins and heavy regulation of a water company.
OpenAI and xAI are currently being valued like the former while spending like the latter. Sam Altman recently argued that concern about his spending would only be "reasonable" if they reached a point where they had compute they could not monetize.
His implied logic is essentially that as long as there is a line of people waiting to talk to a chatbot, a loss is not a loss—it is just unmet demand. It is a convenient way to redefine billions in cumulative burn as a supply chain bottleneck. Here's Altman in a recent interview:
I mean, as revenue grows and as inference becomes a larger and larger part of the fleet, it eventually subsumes the training expense. So that’s the plan. Spend a lot of money training, but make more and more.
If we weren't continuing to grow our training costs by so much, we would be profitable way, way earlier. But the bet we’re making is to invest very aggressively in training these big models.
...we see this consumer growth, we see this enterprise growth. There's a whole bunch of new kinds of businesses that we haven't even launched yet, but will. But compute is really the lifeblood that enables all of this.
We have always been in a compute deficit. It has always constrained what we're able to do.
Altman's path to profitability, then, rests on a simple bet: that OpenAI can keep finding buyers for its computing power as fast as it can build the data centers to house it. The "compute deficit" is just a venture capital-inflected way of saying that the demand curve for AI is, for all practical purposes, infinite.
If you believe that, then spending $1.4 trillion isn't a speculative bubble; it's a rational response to an unprecedented market opportunity. If you don't, then you start to see these companies in a different light.
The strategy for both xAI and OpenAI is to keep raising tens of billions of dollars—xAI just closed another $20 billion round this month—to buy the physical world in hopes that they can eventually escape back into the virtual one. They want to be AI software companies, but for now, they are the world's most aggressive construction firms. If the insatiable demand for those tokens ever blinks, they will be left with a lot of very expensive, very hot silicon in a building in Memphis.
More on AI Funding:
- Anthropic signs term sheet for $10 billion funding round at $350 billion valuation (CNBC)
- North American Startup Funding Soared 46% In 2025, Driven By AI Boom (Crunchbase)
On Our Radar
Our Intelligence Desk connects the dots across functions—from GTM to Operations—and delivers intelligence tailored for specific roles. Learn more about our bespoke streams.
The Hardware Hedge
- The Headline: Lenovo, the world's largest PC maker, is leveraging an Nvidia partnership for enterprise AI and launching its own personal AI platform, "Qira," in a strategic pivot from a pure hardware manufacturer to an integrated AI ecosystem player. (Reuters)
- ARPU's Take: This is a pre-emptive defense against the commoditization of the PC. Lenovo sees the writing on the wall: in the AI era, the value shifts from the physical device to the on-device AI agent. Without their own software layer ("Qira"), they risk becoming a low-margin vessel for Microsoft's Copilot, ceding the entire user relationship and a majority of the value to the OS provider.
- The Go-to-Market Implication: Lenovo is executing a two-pronged GTM strategy to capture value at both ends of the AI market. 1) The Enterprise Play: The Nvidia "AI Cloud Gigafactory" partnership allows their enterprise sales force to sell a high-margin, turnkey AI infrastructure solution, leveraging their core strength in supply chain and logistics. 2) The Consumer Play: The "Qira" platform is a direct attempt to build a branded, cross-device ecosystem (PC, phone, tablet) that can be marketed as a unique selling proposition, preventing their hardware from being defined solely by the capabilities of the underlying Windows or Android OS.
P.S. We're building a handful of bespoke intelligence streams for members. If you're facing a specific intelligence challenge, drop us a line here.