The AI industry is facing a moment of reckoning. While U.S. tech giants pour billions into massive data centers and ever-larger models, a small Chinese startup named DeepSeek may have quietly demonstrated that the “bigger is better” philosophy is fundamentally flawed.
Just days before DeepSeek went viral, President Trump stood alongside tech titans Sam Altman, Masayoshi Son and Larry Ellison to unveil Stargate — a $500 billion plan to maintain U.S. dominance in AI infrastructure. The timing couldn’t have been more ironic.
The success of DeepSeek raises an uncomfortable question: What if we’re building tomorrow’s Rust Belt?
DeepSeek’s latest model achieves what seemed impossible: capabilities comparable to leading models while using significantly fewer resources. Its API costs $0.55 per million input tokens, compared to OpenAI’s $15 — a reduction in API costs of more than 96 percent. That’s not just an efficiency gain — it’s a fundamental challenge to how we think about AI development.
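The arithmetic behind that gap can be sketched in a few lines. The per-million-token prices come from the figures quoted above; the monthly token volume is purely an illustrative assumption, not real workload data:

```python
# Sketch of the API cost comparison using the per-token prices quoted above.
# The monthly token volume is an illustrative assumption.

def api_bill(tokens_millions: float, price_per_million: float) -> float:
    """Return the API cost for a given volume of input tokens."""
    return tokens_millions * price_per_million

DEEPSEEK_PRICE = 0.55   # USD per million input tokens
OPENAI_PRICE = 15.00    # USD per million input tokens

tokens = 1_000  # assume 1 billion input tokens per month (hypothetical)

deepseek_bill = api_bill(tokens, DEEPSEEK_PRICE)  # $550
openai_bill = api_bill(tokens, OPENAI_PRICE)      # $15,000

reduction = 1 - DEEPSEEK_PRICE / OPENAI_PRICE
print(f"Cost reduction: {reduction:.1%}")  # → Cost reduction: 96.3%
```

At any volume the ratio is the same: the DeepSeek bill is under 4 percent of the OpenAI bill, which is where the "more than 96 percent" reduction comes from.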
This efficiency gap becomes even more striking when we consider the open source strategies at play. As Meta’s chief AI scientist, Yann LeCun, notes, it’s not that China’s AI is “surpassing the U.S.,” but rather that “open source models are surpassing proprietary ones.”
This opens up an exciting possibility: what if massive compute spending isn’t the price of progress after all? DeepSeek’s approach suggests that when you combine transparency with efficiency, you create something powerful — a pathway to more sustainable and accessible AI development.
The parallel to American industrial history is stark: U.S. steel companies once continued building massive mills even as more efficient approaches emerged elsewhere. What happens if China does to us in AI what it did with steel — underprice, win the market, and leave our expensive infrastructure sitting idle?
DeepSeek’s success suggests that the future of AI might not belong to those who build the biggest models, but to those who build the most transparent and efficient ones.
Chinese firms aren’t the only ones prioritizing efficiency. While the U.S. bets on big data centers, France is quietly embracing a more nuanced, lifecycle-centric model. Dubbed “frugal AI,” it emphasizes:
- Energy-efficient design: Benchmarking AI systems with green algorithms and eco-design principles;
- Specialized (not generic) models: Favoring task-specific AI over sprawling monoliths;
- Lifecycle approach: From hardware manufacturing to heat recovery, ensuring each phase is sustainable;
- Open reuse of existing models: Aligning with the Association Française de Normalisation’s BP29 on “reusing trained algorithms,” so developers can iterate on proven AI kernels rather than re-inventing them.
France also benefits from a relatively low-carbon energy mix, courtesy of its existing nuclear infrastructure. Rather than pour billions into entirely new power grids, French organizations can focus on demand-side optimization: trimming AI’s resource needs via smaller, carefully targeted models. This echoes several AFNOR recommendations — such as compressing algorithms, reducing data sprawl and systematically reusing pre-trained models — that collectively maximize efficiency, much like what made DeepSeek so successful.
What happens next is uncertain. But one thing is clear: transparency and efficiency, both from a cost and an energy perspective, must be front and center in our development and deployment of AI.
Large-scale infrastructure spending can bring long-term benefits. Much like the dot-com era’s boom-and-bust led to the overbuilding of fiber networks — which later fueled the rise of cloud computing and streaming — today’s AI investments could similarly pay off. However, these benefits are not guaranteed.
Even if AI scaling is inevitable, today’s infrastructure establishes the architectural frameworks that will shape how AI systems evolve. Our investments therefore need to align with both today’s demands and tomorrow’s possibilities. The real challenge isn’t just how much we build but how we build it: creating flexible frameworks that can support evolving AI innovations.
The race for AI supremacy won’t be won by whoever builds the biggest data centers, but by whoever builds the smartest, most transparent and efficient ones. The question is, with $500 billion committed to Stargate, will the U.S. secure a sustainable leadership position, or will continued excess ultimately breed waste?
Aya Saed is director of AI policy and strategy at Scope3.