Broadcom Sees AI Spending Surge Lasting Until End of Decade
Silicon Valley's frenzied investment in artificial intelligence (AI) shows no signs of slowing down, according to Broadcom CEO Hock Tan, speaking to the Financial Times. Tan predicts this spending spree will continue until the end of the decade, driven by the insatiable need for advanced computing power to fuel AI development and deployment.
"They are investing full-tilt," Tan told the Financial Times, referring to his clients in Silicon Valley. "They will stop when they run out of money or when shareholders put a stop to this."
Tan's comments follow Broadcom's remarkable performance in the AI chip market. In fiscal year 2024, the company reported a staggering 220% surge in AI revenue, reaching $12.2 billion. This dramatic increase propelled Broadcom's market capitalization past the $1 trillion mark for the first time.
Analysts suggest that Broadcom's AI chip clients include tech giants such as Google, Meta, and ByteDance (TikTok's parent company). The company is also reportedly collaborating with OpenAI and Apple on custom server chips designed to accelerate AI training and deployment.
This surge in demand for AI chips comes as tech companies seek alternatives to Nvidia, the current dominant force in the market for high-performance processors needed to train large language models.
Tan played down speculation that Broadcom might attempt to rescue Intel, the struggling US chipmaker, stating that his company is fully focused on the burgeoning AI semiconductor market. "That is driving a lot of my resources, a lot of my focus," Tan said.
Tan's comments highlight the massive investment underway in data centers to support the creation and operation of increasingly complex AI models. xAI's "Colossus" facility in Memphis, boasting 100,000 Nvidia GPUs, exemplifies this trend. Tan anticipates that by 2027, Broadcom's customers will be building clusters of up to 1 million AI chips.
While the full impact of generative AI on mainstream businesses remains uncertain, Tan is confident that Big Tech companies see immense potential for revenue generation. "They need to train [AI] on a scale that the world has hardly ever seen before," Tan said. "That consumes huge amounts of silicon. That's where we show up."
The rapid advancements in generative AI, driven by the principle that more data and computing power result in smarter models, necessitate a constant increase in AI chip demand. "They have a formula to keep doing it and they are not at the end of the formula yet," Tan noted. "All roads lead to: you need more computing chips."