
Amazon Steps Up Effort to Rival Nvidia in AI Chip Market

Amazon is making significant strides in its quest to challenge Nvidia's dominance in the AI chip market, reports the Financial Times. The tech giant's cloud computing division is investing heavily in custom chips, spearheaded by Annapurna Labs, a chip start-up acquired by Amazon in 2015.

The latest development is the upcoming release of Trainium 2, part of a line of AI chips designed to train the largest AI models. Trainium 2 is already being tested by Anthropic, a competitor to OpenAI backed by Amazon, as well as Databricks, Deutsche Telekom, and Japan's Ricoh and Stockmark.

Amazon's goal is to offer a compelling alternative to Nvidia, currently the world's most valuable company thanks to its dominance of the AI processor market.

"We want to be absolutely the best place to run Nvidia," said Dave Brown, vice-president of compute and networking services at AWS. "But at the same time we think it’s healthy to have an alternative."

Amazon highlights the cost savings its chips offer: Inferentia, another line of specialist AI chips built for inference, is already 40% cheaper to run when generating responses from AI models.

"The price [of cloud computing] tends to be much larger when it comes to machine learning and AI," Brown said. "When you save 40 per cent of $1,000, it’s not really going to affect your choice. But when you are saving 40 per cent on tens of millions of dollars, it does."

Amazon's investment in AI infrastructure is substantial: the company expects capital expenditure of about $75 billion in 2024, with the majority allocated to technology infrastructure. That is a sharp increase from 2023, when it spent $48.4 billion over the entire year.

The move by Amazon to develop its own data center chips is part of a broader industry trend, with Microsoft, Google, and Meta also designing their own chips to support AI growth.

"Every one of the big cloud providers is feverishly moving towards a more verticalised and, if possible, homogenised and integrated [chip technology] stack," said Daniel Newman at The Futurum Group to the Financial Times.

Amazon's approach involves building everything from the silicon wafer to the server racks, all underpinned by its proprietary software and architecture.

"It’s really hard to do what we do at scale. Not too many companies can," said Rami Sinno, Annapurna's director of engineering.

Despite these efforts, Amazon has yet to significantly dent Nvidia's dominance in the AI infrastructure market. Nvidia recorded $26.3 billion in revenue from AI data center chip sales in its second fiscal quarter, while Amazon's entire AWS division generated $26 billion over the same period.

While Amazon avoids direct performance comparisons and does not submit its chips to independent benchmarks, experts believe that offering customers more choice could be key to challenging Nvidia's market share.

"People appreciate all of the innovation that Nvidia brought, but nobody is comfortable with Nvidia having 90 per cent market share," said Patrick Moorhead of Moor Insights & Strategy, to the Financial Times. "This can’t last for long."