The Nuclear Scrapyard
Bomb-Grade Data Centers
If you want to build a nuclear power plant in the United States, your primary obstacle is not the laws of physics. It is the Nuclear Regulatory Commission. The NRC is the world's most meticulous gatekeeper, overseeing a civilian licensing process so thorough, so expensive, and so breathtakingly slow that almost no one even tries to navigate it anymore.
Building a "Next-Gen" small modular reactor sounds like a perfect solution for the AI power crunch, but in regulatory terms, "Next-Gen" is just another way of saying "I would like to spend years filling out paperwork."
And so, we have the arrival of HGP Intelligent Energy. The Texas-based power developer has proposed a plan that sidesteps the innovation trap entirely. Instead of trying to invent a new way to split the atom, HGP wants to rummage through the Navy's attic. Here's Bloomberg on the proposal:
[HGP] filed an application to the Energy Department to redirect two retired reactors to a data center project proposed at Oak Ridge, Tennessee, according to a letter submitted to the agency's Office of Energy Dominance Financing. The project, filed for the White House’s Genesis Mission, would produce about 450-520 megawatts of around-the-clock electricity, enough to power roughly 360,000 homes.
HGP's plan includes a revenue share with the government, and the company would create a decommissioning fund.
The developer plans to file for a loan guarantee from the Energy Department, according to the letter. The project would require about $1.8 billion to $2.1 billion of private capital to build related infrastructure to prepare the reactors for general use, according to the proposal submitted to the DOE.
HGP's bet is that the hardware itself is the shortcut. These reactors—A4W units from Westinghouse and S8G-class units from General Electric—are not experimental. They have been powering aircraft carriers and America's own Red Octobers for decades. They are the most tested, reliable "small reactors" on the planet.
But the real magic is that these reactors exist in a different legal dimension. Military nuclear hardware operates under an entirely separate framework from the one that governs the local utility plant. HGP is performing a piece of regulatory arbitrage: if they can convince the government to "redirect" these military assets for a "Mission" (in this case, the White House's Genesis Mission for AI), they can bypass the civilian red tape that makes traditional projects impossible.
The numbers on the spreadsheet confirm the arbitrage. HGP estimates that converting these reactors for civilian service would cost between $1 million and $4 million per megawatt. That is a rounding error compared to the cost of a new-build civilian plant. By using existing hardware and fuel that the military already knows how to handle, HGP is attempting to build a high-speed bypass around the NRC.
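The figures roughly hang together on the back of an envelope. Here is a minimal sketch; the ~10,500 kWh-per-year average household consumption and the assumption of continuous full-power output are our own illustrative inputs, not numbers from HGP's filing:

```python
# Back-of-envelope check of the figures quoted above. The ~10,500 kWh/yr
# average household consumption and the 24/7 full-power assumption are
# illustrative inputs, not numbers from HGP's filing.

HOURS_PER_YEAR = 8_760
AVG_HOME_KWH_PER_YEAR = 10_500        # rough U.S. average, assumed

for mw in (450, 520):
    annual_kwh = mw * 1_000 * HOURS_PER_YEAR        # MW -> kW, running all year
    homes = annual_kwh / AVG_HOME_KWH_PER_YEAR
    print(f"{mw} MW around the clock ~= {homes:,.0f} homes")   # vs. the quoted ~360,000

for cost_per_mw in (1_000_000, 4_000_000):
    total_low, total_high = 450 * cost_per_mw, 520 * cost_per_mw
    print(f"${cost_per_mw/1e6:.0f}M/MW -> ${total_low/1e9:.2f}B to ${total_high/1e9:.2f}B total")
```

At the top of those ranges, 520 megawatts at $4 million per megawatt lands around $2.1 billion, which is right where the private-capital estimate in the letter sits.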
The only real wall is the fuel itself. Civilian reactors use low-enriched uranium. Military reactors use the highly enriched kind—the stuff you can make a bomb with. This is why the hardware is sealed and the licensing is classified. Attempting to bring this into a civilian context is, as Bloomberg notes, "uncharted territory."
There are two ways to look at this. The first is that it is a sign of complete desperation in the power market. We have run out of easy options, so now we are trying to wire data centers directly to decommissioned weapons of war.
The second view, and the one more relevant to the market, is that the AI boom is forcing a merger between national security and commercial infrastructure. In the race to build superintelligence, the government is being asked to treat a server farm with the same urgency as a carrier strike group. If HGP gets its loan guarantee, it won't be because they built a better reactor; it will be because they successfully argued that in 2026, a data center is just a warship that doesn't move.
More on Nuclear Energy:
- Can nuclear power really fuel the rise of AI? (MIT Technology Review)
- AI's Impact on the Surge of Nuclear Investments (VanEck)
On Our Radar
Our Intelligence Desk connects the dots across functions—from GTM to Operations—and delivers intelligence tailored for specific roles. Learn more about our bespoke streams.
TSMC's Roadmap
- The Headline: TSMC has begun mass production of its 2nm node, with unprecedented AI demand forcing a simultaneous ramp alongside mobile customers and driving flagship chip costs to over $300 per chip. (TrendForce)
- ARPU's Take: The insatiable, cost-agnostic demand from the AI sector has grown so large that it now dictates TSMC's production roadmap, putting AI data center economics on equal footing with the flagship consumer smartphone market for the first time in history.
- The Operations Implication: The decision to ramp large-die AI chips concurrently with smaller mobile SoCs on a new node is a radical operational shift. It jettisons the industry's long-held risk mitigation strategy of perfecting yields on smaller dies first. This move proves the AI market's demand is so inelastic that the opportunity cost of delaying GPU supply now outweighs the immense financial risk of poor initial yields on massive dies cut from $30,000 wafers. TSMC is effectively letting the AI sector's blank check underwrite the operational risk for the entire 2nm generation.
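To see why that gamble is so asymmetric, consider a toy Poisson yield model. The die sizes, defect densities, and the $30,000 wafer price below are illustrative assumptions, not TSMC disclosures; the point is only that cost per good die blows up with die area when early defect density is high:

```python
import math

# Illustrative Poisson yield model: yield = exp(-defect_density * die_area).
# Defect densities, die areas, and the $30,000 wafer price are assumptions
# for illustration only, not TSMC figures.

WAFER_PRICE = 30_000                               # USD per 2nm wafer (reported ballpark)
WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2 * 0.9    # 300mm wafer, ~10% lost to edge/scribe

def cost_per_good_die(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Cost of one functional die under a simple Poisson yield model."""
    yield_frac = math.exp(-defects_per_cm2 * die_area_mm2 / 100)   # convert mm^2 to cm^2
    dies_per_wafer = WAFER_AREA_MM2 / die_area_mm2                 # ignores edge geometry
    return WAFER_PRICE / (dies_per_wafer * yield_frac)

for name, area in (("mobile SoC, ~100 mm^2", 100), ("AI accelerator, ~800 mm^2", 800)):
    early = cost_per_good_die(area, defects_per_cm2=0.5)   # immature ramp
    mature = cost_per_good_die(area, defects_per_cm2=0.1)  # mature process
    print(f"{name}: ${early:,.0f} per good die early vs ${mature:,.0f} mature")
```

Under these made-up numbers, an immature ramp inflates the cost of a small mobile die by roughly half, but multiplies the cost of a reticle-sized AI die by more than 20x. That is the operational risk the AI sector's blank check is absorbing.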
The Cloud Counter-Thesis
- The Headline: Perplexity CEO Aravind Srinivas posits that on-device AI is the primary long-term threat to the centralized data center model, framing it as the industry's "$10 trillion question" and identifying Apple and Qualcomm as the key beneficiaries. (Yahoo Finance)
- ARPU's Take: This is a direct shot across the bow of the hyperscaler capital expenditure narrative. Srinivas is publicly arguing that the current high-cost, centralized approach is a temporary state, not a permanent moat.
- The Investment Implication: This argument introduces a massive, long-term risk vector for any investment thesis predicated on unending data center growth. It bifurcates the AI value chain into two competing capital allocation strategies: (1) the centralized "Compute Landlords" (Nvidia, Cloud providers) who are betting on the cloud, and (2) the decentralized "Edge Enablers" (Apple, Qualcomm, ARM) who win if intelligence moves to the device. Srinivas’s warning forces investors to price in the possibility that the current GPU-fueled data center boom is not a secular supercycle, but a temporary—albeit massive—bridge to a more distributed, power-efficient future.
P.S. Tracking these kinds of complex, cross-functional signals is what we do. If you have a specific intelligence challenge that goes beyond the headlines, get in touch to design your custom intelligence.