Six months after OpenAI launched Sora to global fanfare, Sam Altman killed it. Not with a blog post announcing a pivot, and not after a measured wind-down, but abruptly, with a compute bill that had become impossible to justify and a retention curve that had fallen off a cliff.
According to a joint investigation published by TechCrunch and the Wall Street Journal on March 29, Sora was burning approximately $1 million per day in GPU compute at the time of its shutdown. User counts had peaked around 1 million active accounts before crashing to under 500,000 — a more than 50% collapse in the months after launch.
The Numbers That Killed Sora
The economics were straightforward, and brutal. At $1M/day, Sora was consuming roughly $365 million annually in compute alone — before accounting for engineering headcount, infrastructure, or any other overhead. OpenAI is not profitable. That kind of run rate for a product with collapsing user numbers was not defensible.
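The run-rate arithmetic can be checked in a few lines. The $1M/day figure and the user counts come from the reporting above; the implied per-user cost is a back-of-the-envelope derivation, not a reported number:

```python
# Back-of-the-envelope math from the reported figures.
# $1M/day and the user counts are from the reporting;
# the per-user cost is an illustrative derivation.

DAILY_COMPUTE_COST = 1_000_000   # USD/day, as reported
PEAK_USERS = 1_000_000           # reported peak
REMAINING_USERS = 500_000        # reported post-collapse figure

annual_compute = DAILY_COMPUTE_COST * 365
monthly_cost_per_user = DAILY_COMPUTE_COST * 30 / REMAINING_USERS

print(f"Annual compute: ${annual_compute:,}")                 # $365,000,000
print(f"Monthly compute per active user: ${monthly_cost_per_user:.2f}")  # $60.00
```

At roughly $60 of compute per remaining active user per month, even a paid tier at typical consumer price points would not have covered costs, which is the gap the next section describes.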
What made the math worse: the users who stayed weren't the high-value enterprise segment that justifies that kind of spend. Sora attracted hobbyists, students, and experimenters — not the film studios and content production houses that were supposed to anchor the business. The gap between the product's cultural splash and its commercial reality was enormous.
Sam Altman's decision to pull the plug was reportedly tied to a broader compute reallocation. OpenAI is fighting on multiple fronts, from its GPT-5 series to real-time voice to coding tools, and every GPU is spoken for. Sora was occupying capacity that other bets needed.
Disney Found Out 45 Minutes Before Everyone Else
The most damaging detail in the reporting isn't the compute cost. It's what happened to Disney.
Disney had committed $1 billion to a partnership built around Sora, expecting the video model to become a core part of its content production and licensing infrastructure. The partnership was a marquee deal — the kind OpenAI uses to signal that it's a serious enterprise player, not just a consumer app company.
According to sources cited in the WSJ piece, Disney executives received notification of Sora's shutdown roughly 45 minutes before the public announcement went live. There was no negotiation, no wind-down period, no formal process. A $1 billion commitment, and the counterparty got less than an hour's notice.
The reputational implications of that timeline are significant. Enterprise customers making nine- and ten-figure commitments to OpenAI infrastructure now have a concrete data point about how those commitments get treated when internal priorities shift.
Claude Code Was Winning the Customers OpenAI Was Counting On
The broader competitive context in the reporting points to Anthropic's Claude Code as a material factor in the shutdown decision. Claude Code is not a video product, but it has been systematically winning the enterprise software development customers that OpenAI expected to dominate through its developer tools and ecosystem.
OpenAI's coding-adjacent revenue was supposed to validate the compute spend on prestige products like Sora. When Claude Code started taking those customers, the internal calculus on what the GPU budget should prioritize shifted. Video generation became harder to justify when the enterprise software market — where margins are higher and retention stickier — was being ceded to a competitor.
The freed-up compute from Sora's shutdown is expected to go toward OpenAI's coding and agentic products. The company is essentially choosing to compete on the battlefield where the revenue actually is, rather than the battlefield where the press coverage was.
What This Means for Generative Video
Sora's failure is not evidence that generative video is a dead end. Runway, Pika, and Kling continue to operate and grow. Google's Veo 3 is in wide testing. The market exists.
What Sora's failure demonstrates is that a consumer-facing video product — even one with state-of-the-art quality — cannot survive on novelty alone. The cost structure of video generation is fundamentally different from text: higher per-token compute, larger memory requirements, slower inference. The unit economics require either very high prices, very high volume, or enterprise contracts that spread the cost across professional use cases.
OpenAI had none of those in sufficient quantity. The product was too expensive for casual users and not integrated enough into professional workflows to command enterprise prices. It occupied an expensive middle ground.
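The "expensive middle ground" argument can be made concrete with a toy unit-economics sketch. Every number here is an assumption for illustration, not a reported figure: the per-clip compute cost, the usage rate, and the candidate subscription price are all hypothetical:

```python
# Illustrative unit economics for a consumer video-generation tier.
# ALL figures below are assumptions, not reported numbers.

COST_PER_CLIP = 0.50           # assumed GPU cost per generated clip, USD
CLIPS_PER_USER_MONTH = 100     # assumed usage of an engaged hobbyist
SUBSCRIPTION_PRICE = 20.00     # assumed consumer price point, USD/month

compute_cost_per_user = COST_PER_CLIP * CLIPS_PER_USER_MONTH
margin = SUBSCRIPTION_PRICE - compute_cost_per_user
break_even_price = compute_cost_per_user  # price needed just to cover compute

print(f"Compute cost per user: ${compute_cost_per_user:.2f}/month")
print(f"Margin at ${SUBSCRIPTION_PRICE:.0f}/month: ${margin:+.2f}")
print(f"Break-even price (compute only): ${break_even_price:.2f}/month")
```

Under these assumptions an engaged user costs $50/month in compute against a $20 subscription, a $30 monthly loss per user before any overhead. The only escapes are the ones named above: raise prices past what hobbyists will pay, or land enterprise contracts that absorb the cost inside professional workflows.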
For the rest of the industry, the lesson is clear: generative video needs a workflow integration story, not just a quality story. The technology is impressive. The business model is the hard part.