
Thinking Machines Lab and Nvidia Announce Gigawatt-Scale AI Partnership

Michael Ouroumis · 2 min read

Thinking Machines Lab, the AI startup co-founded by former OpenAI CTO Mira Murati, has secured one of the largest compute partnerships in AI history. The multi-year strategic deal with Nvidia will give Thinking Machines access to at least one gigawatt of next-generation Vera Rubin systems — a commitment that rivals the infrastructure footprint of established AI giants.

What the Deal Includes

The partnership, announced jointly by Nvidia CEO Jensen Huang and Murati, covers three key pillars. First, Thinking Machines will deploy Nvidia's Vera Rubin platforms at scale to train its frontier AI models, with rollout beginning in 2027. Second, Nvidia has made a "significant investment" in Thinking Machines, though neither party disclosed the exact figure. Third, the two companies will collaborate on technical optimizations, ensuring Thinking Machines' products are tuned specifically for Nvidia's silicon.

A gigawatt of compute is a staggering figure. For context, that is roughly the output of a large nuclear power plant dedicated entirely to AI training. It puts Thinking Machines in the same infrastructure tier as OpenAI, Google DeepMind, and Anthropic — companies that have spent years and tens of billions building out their compute capacity.

Why Nvidia Made This Bet

For Nvidia, the deal extends its dominance in the AI training market at a critical moment. With AMD's MI400 accelerators gaining traction and custom silicon from Google and Amazon maturing, locking in a high-profile customer like Thinking Machines reinforces the Vera Rubin platform's position as the default choice for frontier labs.

The investment also signals Nvidia's confidence in Murati's vision. Since leaving OpenAI in late 2024, Murati has assembled a team of top researchers and engineers, many recruited from OpenAI, Google DeepMind, and Meta FAIR. While the company has remained tight-lipped about its model architecture and product roadmap, the scale of compute it is now acquiring suggests ambitions well beyond a niche research lab.

Implications for the AI Landscape

The partnership reshapes the competitive dynamics of the frontier AI race. Thinking Machines now has a credible path to training models at a scale previously accessible only to a handful of hyperscalers and well-funded incumbents.

It also highlights the growing importance of compute partnerships as a strategic lever. Rather than building data centers from scratch, Thinking Machines is leveraging Nvidia's ecosystem to accelerate its timeline — a playbook that more startups may follow as the cost of frontier training runs continues to climb.

For the broader industry, the message is clear: the barrier to entry for frontier AI research is not just capital, but access to the right hardware at the right scale. Murati appears to have secured both.

