Why Alibaba Is Giving Away Its Advanced AI Models for Free
Alibaba Chairman Joe Tsai recently recounted a "holy cow" moment inside the Chinese tech giant. When startup DeepSeek launched its stunningly efficient AI model in January, Alibaba's engineers were so spooked by the risk of falling behind that they canceled their Lunar New Year holidays to work around the clock. But Tsai's more telling revelation came when explaining Alibaba's response: the company is making its own powerful Qwen AI models open-source, effectively giving away some of its most advanced technology for free. This move highlights a fundamental question in the multitrillion-dollar AI boom: if the AI models themselves are free, how does anyone actually make money?
Why would a company give away its most advanced technology?
For Alibaba, the strategy is a modern take on the classic razor-and-blades business model: give away the razor to sell a lifetime of blades. By open-sourcing its Qwen models, Alibaba isn't forgoing revenue; it's aiming to capture it at a different layer of the stack. As Tsai explained at the VivaTech conference, open-sourcing "democratises the usage of AI" and will "proliferate applications." This proliferation creates massive demand for the one thing all AI applications need: computational power.
Every time a developer uses an open-source model to train a new system or an enterprise runs an AI application, it consumes vast amounts of cloud computing resources for both training and inference (the process of using a trained model to generate a response). As a dominant cloud provider in Asia, Alibaba is betting that the revenue it gains from selling this compute power will far outweigh what it might have earned by licensing the model directly. In this view, the AI model is a loss leader designed to fuel a much larger, more profitable business in cloud infrastructure.
How does this compare to other AI business models?
Alibaba’s approach contrasts with the strategies of other major players, creating a clear divide in how the AI market is being monetized:
OpenAI/Microsoft: Theirs is primarily a direct-access model. OpenAI sells subscriptions for ChatGPT and charges developers for API access to its proprietary models, such as GPT-4o and its o-series reasoning models. While Microsoft reaps enormous benefits from running these workloads on its Azure cloud, the core product being sold is access to the AI model's intelligence itself.
Google: Google is pursuing an integration strategy. It is embedding its Gemini models deeply into its existing billion-user products like Search and Workspace, aiming to defend its core advertising business and add value to its enterprise subscription services. While it also offers API access, its primary play is using AI to enhance its existing, immensely profitable ecosystems. Its in-house TPU chips also give it a significant structural cost advantage, allowing it to offer services at a lower price point.
Nvidia: As the ultimate "picks and shovels" provider, Nvidia's model is the purest. It sells the essential hardware—the GPUs, networking, and full server systems—that everyone else needs to build and run their AI. With its dominant CUDA software ecosystem creating a deep moat, Nvidia is largely agnostic about which AI model wins, as long as the demand for computational power continues to explode.
What does this mean for the AI 'model wars'?
The rise of high-quality open-source models from players like Alibaba, Meta (with Llama), and DeepSeek is accelerating a trend toward the commoditization of the models themselves. If powerful models become readily available and essentially interchangeable for many common tasks, the competitive advantage shifts away from having a single best model.
Instead, the battleground moves to other areas: the efficiency and cost of the underlying compute, the ease of integration into enterprise workflows, and the strength of the surrounding ecosystem. This threatens the long-term defensibility of companies whose primary moat is the perceived superiority of their proprietary model. If an open-source model is "good enough" and dramatically cheaper to run, many businesses may opt for that, putting pressure on the premium pricing of closed models.
What role does competition play in this?
Tsai's "holy cow" anecdote reveals the pressure-cooker environment that is forcing these strategic decisions. The fear of being technologically leapfrogged is a powerful motivator for radical moves like open-sourcing a key asset. By making its Qwen models free, Alibaba is not just trying to sell cloud services; it's trying to rapidly build a community of developers and users around its technology, creating a defense against the market-share dominance of closed-model competitors like OpenAI. In this race, the speed of adoption can be as important as raw performance, and giving away the "razor" for free is one of the fastest ways to get it into as many hands as possible.
Reference Shelf:
Alibaba chairman says open source will spur AI apps, cloud demand after ‘huge ordeal’ (South China Morning Post)
The cost of compute: A $7 trillion race to scale data centers (McKinsey)
DeepSeek Races After ChatGPT as China’s AI Industry Soars (Bloomberg)
Microsoft CEO Satya Nadella on His AI Efforts and OpenAI Partnership (Bloomberg)