How OpenAI Paid for Its AMD Chips
The Kingmaker
Ordinarily, when you buy a product, you pay for it with money. OpenAI, it seems, has found a way to pay for its AI chips with something more ephemeral: hype.
This week, the AI startup announced a landmark deal to purchase tens of billions of dollars' worth of AI processors from AMD. The market's reaction was immediate and explosive: AMD's stock soared 24%, adding roughly $80 billion to its market value.
And here is where the transaction gets strange. As part of the arrangement, AMD is giving OpenAI warrants to acquire up to 10% of its stock for a trivial price—just one cent per share. This is, in effect, a massive kickback. OpenAI will still pay cash for the chips, but it gets the option to receive billions of dollars in AMD stock for virtually nothing in return. And the logic is a perfect artifact of the current AI frenzy: OpenAI's endorsement is so powerful that the announcement itself creates enormous value for AMD's shareholders. The warrants are simply OpenAI's way of taking a cut of the very wealth it just generated.
As Bloomberg's Matt Levine put it:
This deal between OpenAI and AMD was obviously going to create a lot of stock-market value... Why not use that value to subsidize the deal? ... In rough numbers, OpenAI is getting back half of the value it created for AMD.
Which raises the question: is this a brilliant move by AMD, or a desperate one? Ordinarily, you don't give away a tenth of your company to a customer to convince them to buy your product. It suggests you couldn't get them in the door otherwise.
But this wasn't just a financial play; it was a deep engineering partnership. And to grasp its importance, you have to understand that the real war against Nvidia has never been just about hardware; it's about software. For years, Nvidia's CUDA platform has created a powerful moat that has locked in developers and starved competing platforms of oxygen. AMD has faced a classic chicken-and-egg problem: its software (ROCm) couldn't become the standard without large-scale users, and it couldn't attract large-scale users without being the standard.
The OpenAI deal breaks that deadlock. It's not just about getting feedback on the specs of the upcoming MI450 chip. It's about force-maturing AMD's entire software stack in the most demanding production environment on Earth. By embedding its engineers with the world's leading AI lab, AMD is effectively paying for a decade's worth of ecosystem development and real-world validation, compressed into a few years.
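What "maturing the software stack" means in practice is easiest to see in code. Here is a minimal sketch, assuming a ROCm build of PyTorch: on such a build, AMD GPUs are driven through the same torch.cuda interface that CUDA developers already write against, and whether ordinary code like this runs fast and reliably at production scale is exactly the ecosystem gap the partnership is meant to close.

```python
import torch

# On a ROCm build of PyTorch, AMD GPUs are reached through the same
# torch.cuda API that CUDA code already uses, so "cuda" below can mean
# an AMD accelerator. torch.version.hip is set only on ROCm builds.
backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(4096, 4096, device=device)
y = x @ x  # the same matmul code path, whichever vendor's GPU sits underneath

if device == "cuda":
    print(f"Ran a 4096x4096 matmul on {torch.cuda.get_device_name(0)} ({backend} build)")
else:
    print("No GPU visible; ran on CPU")
```

The point is not that one matmul proves anything; it's that thousands of such code paths, kernels, and libraries have to behave this transparently before a lab bets a frontier training run on them.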
One way to think about the warrants, then, is as the price AMD was willing to pay for this strategic validation. And it turns out to have been a brilliant wager. The roughly $80 billion in market value AMD gained on Monday was more than double the potential value of the equity it handed over. Is a potential 10% stake in the company a steep price to pay for that kind of co-development? Or is it a very efficient way to pay for the most intense R&D accelerator in the company's history?
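To put the warrant arithmetic in concrete terms, here is a back-of-envelope sketch. It uses only the round figures quoted above plus one outside assumption (AMD's share count of roughly 1.6 billion), and it ignores the vesting conditions attached to the real warrants, so treat it as an order-of-magnitude illustration rather than the deal's actual accounting.

```python
# Back-of-envelope check of the warrant math, using the article's round
# numbers plus one outside assumption (AMD's share count).
market_value_added = 80e9    # ~$80B added to AMD's market cap on the news
stake_fraction = 0.10        # warrants cover up to 10% of the company
exercise_price = 0.01        # one cent per share
shares_outstanding = 1.6e9   # assumption: AMD has roughly 1.6 billion shares

# What OpenAI would actually pay to exercise the full warrant package.
warrant_shares = stake_fraction * shares_outstanding
exercise_cost = warrant_shares * exercise_price
print(f"Exercise cost for the full 10% stake: ${exercise_cost / 1e6:.1f}M")

# "More than double" implies the stake is worth at most about half the gain.
stake_value_ceiling = market_value_added / 2
print(f"Implied value of that stake: up to ~${stake_value_ceiling / 1e9:.0f}B")
print(f"Stock received per dollar paid to exercise: ~{stake_value_ceiling / exercise_cost:,.0f}x")
```

Under these assumptions, exercising the full stake would cost OpenAI under two million dollars, for equity the announcement itself implies is worth tens of billions; the cash for the chips still changes hands, but the kickback is nearly free.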
For now, the deal appears to be a victory for AMD, anointing it as a credible, large-scale supplier of cutting-edge AI chips and officially opening a "second front" in the chip war. For Nvidia, it is a signal that the days of being the only game in town are over. OpenAI, by playing the role of kingmaker, has not only secured the chips it needs but has also reshaped the entire competitive landscape in its favor.
The Box-Mover's Revenge
The first phase of the AI gold rush was simple: buy Nvidia. But as every major company on the planet rushed to do just that, a second, more industrial problem emerged. It turns out that a chip is not a data center, and the real bottleneck is no longer just designing the silicon, but the decidedly less glamorous work of actually building and shipping the machines that hold it.
And so you get the strange resurgence of Dell. This week, the company best known for its beige office PCs nearly doubled its long-term profit growth forecast, citing "insatiable" demand for its AI servers. The driver for this explosion is its Infrastructure Solutions Group, which is now projected to grow at 11-14% annually.
This is not a story about a brilliant new technology. It is a story about the brutal, industrial logic of logistics. While the world was mesmerized by the wizards at Nvidia designing the magic chips, the more immediate problem became one of supply chains. The AI revolution requires a massive, physical build-out, and the most valuable skill is no longer just genius, but competence at a global scale. This is where Dell's decades of experience as the world's most efficient box-mover have become a formidable, and perhaps unexpected, competitive advantage. Here is Reuters:
Insatiable demand for servers that provide the computing power needed to run services such as ChatGPT has turned Dell into one of the biggest winners of the generative AI boom.
Its strong profit growth expectation may also ease investor concerns about the margin hit from competition in AI servers and the high costs of building the products.
"Dell has a volume advantage due to its scale, established supply chain, and relationships with major buyers, compared to rivals like Super Micro," Emarketer analyst Jacob Bourne said.
This dynamic should feel familiar. It is, in fact, the second act of the same play we saw in an even less glamorous corner of the data center: the humble hard disk drive. Just as with servers, the AI boom's insatiable appetite for data storage has transformed the fortunes of Western Digital and Seagate, a duopoly that now enjoys newfound pricing power after years of being treated as a low-margin commodity.
The lesson is clear: the AI revolution is as much about industrial brawn as it is about digital brains. The story of Dell, then, is a companion piece to the story of Western Digital. While the wizards at Nvidia design the magic, the real value is increasingly spilling over to the industrial builders—the companies that can reliably assemble, ship, and install the infrastructure of the new world.
The Scoreboard
- AI: Oracle stock slips on report company is seeing thin cloud margins from Nvidia chips (CNBC)
- Semiconductor: Intel to reveal tech details on forthcoming PC chip, sources say (Reuters)
- EV: Tesla debuts 'affordable' Model Y and 3 that strike some as too expensive (Reuters)