Tech Weekly: How "Distillation" is Disrupting the AI Business Model
The race to build the biggest, most powerful AI models may be over. A technique called "distillation," which allows smaller models to learn from larger ones, is rapidly commoditizing cutting-edge AI, making it cheaper and more accessible than ever before. This poses a growing challenge to the business models of companies that have bet billions on proprietary, resource-intensive AI.
For years, the leading players in AI, including OpenAI, Google, and Anthropic, have operated on the principle that bigger is better. They've invested billions of dollars in building massive "foundation models" trained on vast datasets, requiring enormous computing power. The assumption was that this scale and complexity would create a significant competitive advantage, a "moat" protecting them from smaller rivals.
However, the emergence of DeepSeek, a Chinese startup, has thrown this assumption into question. By distilling foundation models built by larger companies, DeepSeek has built AI models with a fraction of the resources of its US counterparts while achieving comparable or even superior performance on key benchmarks.
At its core, distillation works by having a smaller model learn from a bigger, more advanced one:
Through distillation, companies take a large language model — dubbed a “teacher” model — which generates the next likely word in a sentence. The teacher model generates data which then trains a smaller “student” model, helping to quickly transfer knowledge and predictions of the bigger model to the smaller one. (Financial Times)
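The teacher-student transfer described above can be sketched in a few lines. The sketch below uses the soft-label formulation of distillation (temperature-scaled softmax plus a KL-divergence objective), which is the standard academic recipe rather than anything DeepSeek or OpenAI has disclosed; the logits and temperature values are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher temperature flattens the
    distribution, exposing the teacher's relative confidence in
    near-miss predictions, not just its top choice."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the softened teacher and student
    distributions, scaled by T^2 so gradient magnitudes stay
    comparable across temperatures (the usual convention)."""
    p = softmax(teacher_logits, temperature)  # teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Hypothetical next-token logits: a student that matches the teacher
# exactly incurs zero loss; a mismatched student is penalized.
teacher = [4.0, 1.0, -2.0]
print(distillation_loss(teacher, teacher))           # 0.0
print(distillation_loss(teacher, [1.0, 4.0, -2.0]))  # positive
```

In practice the student is trained by minimizing this loss over large batches of teacher outputs, which is why the knowledge transfer is so much cheaper than training from scratch.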
This process, while not entirely new, has gained significant traction recently, fueled by advances in research and the availability of open-source models. The power of this approach, according to Olivier Godement, head of product for OpenAI’s platform, is that:
Distillation is quite magical. It’s the process of essentially taking a very large smart frontier model and using that model to teach a smaller model...very capable in specific tasks that is super cheap and super fast to execute.
DeepSeek serves as the prime example of distillation's disruptive power. The company's success has not only challenged the notion that massive resources are required for cutting-edge AI, but it has also highlighted the growing importance of efficiency in AI development:
DeepSeek didn’t invent distillation, but it woke up the AI world to its disruptive potential. It also ushered in the rise of a new open-source order — a belief that transparency and accessibility drive innovation faster than closed-door research. (CNBC)
The implications for the established AI players are profound:
If those investments in [infrastructure] don’t provide companies with an unbeatable advantage but instead serve as springboards for cheaper rivals, they might become difficult to justify... “Is it economically fruitful to be on the cutting edge if it costs eight times as much as the fast follower?” said Mike Volpi, a veteran tech executive and venture capitalist who is general partner at Hanabi Capital. (WSJ)
The rise of distillation also throws a spotlight on the open-source vs. closed-source debate. DeepSeek has leveraged open-source models from Meta and Alibaba, and has, in turn, made its own models open for developers. This collaborative approach contrasts sharply with the proprietary strategies of companies like OpenAI. On this open approach, Yann LeCun, Meta's chief AI scientist, said:
We’re going to use [distillation] and put it in our products right away. That’s the whole idea of open source. You profit from everyone and everyone else’s progress as long as those processes are open.
This open approach is rapidly gaining momentum. Last month, researchers at Stanford and the University of Washington created a reasoning model similar to OpenAI's for under $50.
The combination of distillation and open-source models is dramatically lowering the barriers to entry in the AI field. This increased competition could drive down prices, making advanced AI capabilities accessible to a much wider range of businesses and individuals:
Prices for software developers accessing AI models from OpenAI and others have fallen dramatically in the past year. Open-source AI such as DeepSeek’s only promises to lower costs further, according to tech executives. (WSJ)
In a world where leading AI players spend billions in advancing AI systems, yet smaller competitors can always catch up in a matter of months, does a first-mover advantage truly hold value?
Autonomous Vehicles
Nvidia's Automotive Revenue Surges on Driver-Assist Tech Demand
Nvidia's automotive segment revenue more than doubled to a record high in the latest quarter, driven by strong demand for its driver-assist software, CNBC reports. Revenue in the automotive and robotics segment rose 103% year-over-year to $570 million in the fourth quarter of fiscal year 2025, bringing the segment’s total revenue for the year to $1.69 billion.
The increase is attributed to sales of Nvidia's self-driving platforms. CEO Jensen Huang envisions all cars on the road becoming "robotic cars" that collect data for Nvidia-supported AI systems. Analysts anticipate continued growth, fueled by investments in autonomous vehicles and a rising demand for driver-assist systems, with the Chinese EV sector being a prominent adopter of Nvidia's technology.
Quick Hits
- DeepSeek's Bold Claim: DeepSeek claims a theoretical 545% daily profit, a figure that's raising eyebrows in the industry.
- SoftBank's Funding Push: SoftBank seeks a $16 billion loan to further its investments in artificial intelligence.
- Dell's Bullish Forecast: Dell forecasts $15 billion in AI server sales this year, driven by a strong profit outlook and climbing AI server backlog.
- Intel's Setback: Intel delays the opening of its Ohio chip plant to the next decade, pushing back the original 2026 production start date.
- OpenAI's Latest Model: OpenAI unveils GPT-4.5 ‘Orion,’ marking its largest AI model release to date.