Today’s market reaction to DeepSeek’s R1 model was dramatic. The Nasdaq dropped 3.1%, and Nvidia saw its stock plummet by 17% - the largest single-day market cap loss in Wall Street history. The narrative? A Chinese hedge fund’s ability to train and run powerful AI models at a fraction of the cost threatens the entire AI ecosystem.
Their R1 model performs on par with today’s state-of-the-art systems, though in the AI world nobody stays at the top of a leaderboard for long - and that’s a very, very good thing. Interestingly, news that DeepSeek had trained a high-performing model on the cheap broke last month. It was R1, though, that made the headlines.
When we talk about models, there’s an important distinction: DeepSeek released an open-source model, which means anyone can download and run it on their own hardware. When companies (or consumers) use these AI models, they pay for the computing power that runs them - the GPUs and infrastructure processing the data. That infrastructure can be owned and operated by US companies, keeping both the data and the revenue on US soil. So while DeepSeek may have created an incredibly cost-efficient model, that doesn’t mean money will flow to them - and it certainly doesn’t mean we’ll spend less on AI overall.
Jevons paradox, which Microsoft CEO Satya Nadella pointed out today, has helped guide discussions about AI adoption at Kamiwaza. When steam engines became more efficient, coal consumption didn’t decrease - it skyrocketed. Lower costs unlocked new possibilities, new uses, new scale.
The exact same dynamic is playing out in AI. As I wrote about reasoning models when OpenAI’s o1 was released, the future of AI compute goes far beyond training costs - the real demand comes from inference, from actually running these models at scale. By 2025’s standards, we’re barely scratching the surface of AI compute usage.
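The arithmetic behind Jevons paradox is worth making concrete. Here’s a minimal sketch using a standard constant-elasticity demand curve - all the numbers (the 10x cost drop, the elasticity of 1.5) are hypothetical, chosen only to illustrate the mechanism, not estimates for the AI market:

```python
def total_spend(unit_cost: float, elasticity: float,
                base_cost: float = 1.0, base_demand: float = 100.0) -> float:
    """Total spend under constant-elasticity demand.

    demand = base_demand * (unit_cost / base_cost) ** -elasticity
    When elasticity > 1, cutting unit cost *raises* total spend:
    usage grows faster than price falls (Jevons paradox).
    """
    demand = base_demand * (unit_cost / base_cost) ** -elasticity
    return unit_cost * demand

# Hypothetical numbers: compute gets 10x cheaper, demand elasticity 1.5.
before = total_spend(unit_cost=1.0, elasticity=1.5)  # baseline spend
after = total_spend(unit_cost=0.1, elasticity=1.5)   # spend after 10x cost drop

print(f"before: {before:.1f}, after: {after:.1f}")
# With these assumptions, a 10x cost cut roughly triples total spend.
```

The key variable is elasticity: below 1, cheaper compute shrinks total spend; above 1, it grows. The bet implied by Jevons paradox is that demand for AI sits firmly in the second regime.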
Markets focus on headlines - a $6M training breakthrough grabs attention. But DeepSeek’s achievement signals something bigger: when powerful technology becomes accessible, its use explodes.
For Nvidia and other datacenter-related companies, this means one thing: more demand, not less. The world won’t need fewer GPUs - it will need them everywhere. Factory floors, hospital beds, traffic systems (they’re already in our phones and cars). The infrastructure play isn’t shrinking; it’s expanding into every corner of the economy.
DeepSeek didn’t just make AI cheaper - they proved what’s possible. And what’s possible is that AI is about to be everywhere.