Tencent on Thursday open-sourced Hy3 Preview, a new flagship large language model and the first major release from the Hunyuan team since a leadership overhaul earlier this year. The model is a mixture-of-experts system with 295 billion total parameters, 21 billion of which are activated per token, and supports context windows of up to 256,000 tokens. Weights are available on Hugging Face.
The release is Tencent's most serious attempt yet to close the gap with domestic rivals such as DeepSeek, Alibaba's Qwen and Moonshot's Kimi, and to wean its own products off models from outside labs.
A three-month rebuild
According to reporting from South China Morning Post and Caixin, Hy3 Preview began training in late January 2026 and shipped in under three months. Tencent credits a February infrastructure overhaul led by chief AI scientist Yao Shunyu — a former OpenAI researcher the company brought on to lead its AI push — who pushed a full rebuild of the pretraining and reinforcement learning stack.
The 21-billion-activated footprint is the headline economic claim. With only about 7 percent of its weights active on any given token, Hy3 Preview is meant to be noticeably cheaper to serve than dense models of comparable capability, which matters for a company that wants to bake the model into consumer-scale chat and search products.
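A back-of-envelope sketch makes the economics concrete. The parameter counts below come from the article; the dense comparison point and the simple 2-FLOPs-per-parameter rule of thumb are assumptions for illustration, not Tencent's figures.

```python
# Rough per-token compute comparison: sparse MoE vs a hypothetical
# dense model of the same total size. Decoder forward-pass compute
# scales roughly with 2 * (active parameters) FLOPs per token.

TOTAL_PARAMS_B = 295.0   # total parameters (billions), per the release
ACTIVE_PARAMS_B = 21.0   # parameters activated per token

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B

dense_flops_per_token = 2 * TOTAL_PARAMS_B * 1e9
moe_flops_per_token = 2 * ACTIVE_PARAMS_B * 1e9

print(f"active fraction: {active_fraction:.1%}")
print(f"compute vs. equal-size dense model: "
      f"{moe_flops_per_token / dense_flops_per_token:.2f}x")
```

Under this rule of thumb, each generated token costs roughly one-fourteenth the compute of an equally large dense model, though real serving costs also depend on memory bandwidth, routing overhead, and batching.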
Yuanbao swaps out DeepSeek
The most concrete product signal is Yuanbao, Tencent's flagship AI chatbot. Yuanbao had been using DeepSeek as its primary underlying model while the in-house stack was being rebuilt. With Hy3 Preview, Tencent is switching Yuanbao's default engine to its own model and rolling Hy3 out across its broader ecosystem.
On benchmarks, Tencent highlights a jump on SWE-bench Verified — a coding test built around real GitHub bug fixes — from 53% on the prior Hy2 generation to 74.4% on Hy3 Preview, a roughly 40% relative gain. That puts Hy3 in the same neighborhood as other top Chinese open-weight coders, though Tencent concedes the model still trails the best closed frontier systems from OpenAI and Google DeepMind.
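The "roughly 40% relative gain" follows directly from the two scores Tencent reported; a quick check of the arithmetic:

```python
# Sanity-check the reported SWE-bench Verified jump: 53% -> 74.4%.
prior, current = 53.0, 74.4

absolute_gain = current - prior            # in percentage points
relative_gain = (current - prior) / prior  # improvement relative to Hy2

print(f"absolute gain: {absolute_gain:.1f} points")  # 21.4 points
print(f"relative gain: {relative_gain:.1%}")         # 40.4%
```

The 21.4-point absolute jump works out to a 40.4% relative improvement, matching the article's figure.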
Why it matters
Three things stand out. First, turnaround: a 295B MoE trained and shipped in under three months is a signal that Tencent's new AI leadership can move at Chinese-startup speed rather than at big-company speed. Second, independence: pulling DeepSeek out of Yuanbao removes a dependency on a model the company does not control and that is itself under geopolitical pressure. Third, distribution: Tencent has the rare advantage of shipping Hy3 into Yuanbao, WeChat-adjacent surfaces and enterprise tools from day one, which turns the open-source release into a real-world product, not just a leaderboard entry.
Hy3 Preview will not dislodge GPT-5.5 or Claude Opus 4.7 at the frontier, but inside China it meaningfully reshuffles the open-weight hierarchy — and it pulls one of the country's largest consumer AI surfaces back in-house.