OpenAI has agreed to spend more than $20 billion over the next three years on servers powered by Cerebras Systems chips and will receive an equity stake in the startup as part of the deal, according to reporting by The Information, with follow-up coverage on April 16, 2026. The agreement lands as Cerebras prepares to refile paperwork for its long-delayed initial public offering, potentially repricing one of the highest-profile non-Nvidia AI chip companies.
A second, much larger commitment
The new contract builds on a January 2026 agreement in which OpenAI committed to buying up to 750 megawatts of compute capacity from Cerebras in a deal valued at over $10 billion. The latest expansion roughly doubles that commitment and pairs it with an equity component. According to the reports, OpenAI will also pay approximately $1 billion to help fund the construction of data centers that will run its AI models on Cerebras hardware.
The equity structure takes the form of warrants rather than immediate shares. OpenAI is said to receive warrants for a minority stake in Cerebras that could grow to as much as 10% of the company over the three-year term, contingent on additional spending milestones. That structure mirrors the stake-for-compute arrangements that have become common across the AI industry as hyperscalers tie chip and cloud suppliers more tightly to their capacity roadmaps.
IPO refile at a $35 billion valuation
Perhaps the most immediate consequence is for the Cerebras IPO itself. The company, which first filed to go public in 2024 and saw its process stall amid regulatory review, is reportedly preparing to refile paperwork as soon as this week. The Information says Cerebras is targeting a raise of roughly $3 billion at a valuation of about $35 billion — a dramatic step up from prior reported valuations.
An anchor compute contract of this size from OpenAI gives Cerebras a forward revenue story that public-market investors can price against Nvidia's dominant data-center business. It also provides a counterweight to concerns about customer-concentration risk and the cost of wafer-scale manufacturing.
Why OpenAI keeps stacking chip deals
For OpenAI, the Cerebras commitment is the latest in a string of multibillion-dollar infrastructure arrangements — stretching across Nvidia, AMD, Broadcom, Google TPUs, and custom silicon efforts — aimed at securing enough compute to train and serve frontier models through the back half of the decade. By taking equity alongside the compute, OpenAI reduces the risk that a key supplier gets bought out from under it or redirects capacity elsewhere.
The move also reflects a broader diversification push away from sole reliance on the Nvidia-TSMC supply chain, which has faced capacity constraints and geopolitical pressure. Cerebras' wafer-scale architecture is particularly attractive for large-context inference workloads, an area where OpenAI has been investing as agentic and long-running tasks grow in importance.
Implications
If the IPO prices near the reported $35 billion target, Cerebras would become one of the largest pure-play AI chip companies on public markets. It would also test investor appetite for AI-infrastructure IPOs at a moment when public sentiment toward AI and data-center spending is mixed. For OpenAI, the deal further entrenches a capital structure in which the company's compute suppliers are also its shareholders — and vice versa.