
Zhipu AI Releases GLM-5: A 744B Parameter Model Under MIT License

Michael Ouroumis · 2 min read

Chinese AI lab Zhipu AI has released GLM-5, a 744-billion parameter mixture-of-experts (MoE) model, under the MIT license. The model uses 44 billion active parameters per forward pass, features a 200,000-token context window, and scores an impressive 77.8% on SWE-bench Verified.
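The gap between 744B total and 44B active parameters comes from mixture-of-experts routing: each token is sent to only a few expert sub-networks, so most weights sit idle on any given forward pass. The toy sketch below illustrates the idea only; the expert count, top-k value, and dimensions are made-up assumptions, not GLM-5's actual configuration.

```python
import numpy as np

# Illustrative MoE router with hypothetical sizes (NOT GLM-5's real config).
N_EXPERTS = 16   # assumed number of experts in the layer
TOP_K = 2        # assumed experts activated per token
D_MODEL = 8      # toy hidden size

rng = np.random.default_rng(0)
router_w = rng.normal(size=(D_MODEL, N_EXPERTS))          # router weights
experts = rng.normal(size=(N_EXPERTS, D_MODEL, D_MODEL))  # one weight matrix per expert

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router_w                # score every expert for this token
    top = np.argsort(logits)[-TOP_K:]    # indices of the k highest-scoring experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                 # softmax over the selected experts only
    # Only TOP_K of the N_EXPERTS weight matrices are touched for this token.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.normal(size=D_MODEL)
out = moe_forward(token)
active_fraction = TOP_K / N_EXPERTS  # 2/16: only 12.5% of expert weights used
```

With these toy numbers, 2 of 16 expert matrices participate per token, mirroring in miniature how a sparse model can hold 744B parameters but compute with only 44B of them per forward pass.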

Model Specifications

| Spec | Value |
| --- | --- |
| Total parameters | 744B |
| Active parameters | 44B (MoE) |
| Context window | 200K tokens |
| SWE-bench Verified | 77.8% |
| Training hardware | Huawei Ascend |
| License | MIT |

Why This Matters

Open-Source Under MIT

The MIT license is the most permissive available — anyone can use, modify, and commercialize the model without restrictions. This is a stronger commitment to open-source than Meta's Llama license, which includes usage-based restrictions for large companies.

Trained on Huawei Ascend

GLM-5 was trained entirely on Huawei's Ascend AI chips rather than NVIDIA GPUs. This is significant because US export controls have restricted China's access to advanced NVIDIA hardware. GLM-5's strong benchmark performance demonstrates that the Chinese AI ecosystem can produce competitive models despite these restrictions.

SWE-bench Performance

The 77.8% score on SWE-bench Verified, a benchmark that tests a model's ability to resolve real-world software engineering tasks drawn from GitHub issues, places GLM-5 among the top-performing models on this widely watched leaderboard, and well ahead of tools such as Moonshot's Kimi Code, which scored 62% on the same test.

The Chinese Open-Source Wave

GLM-5 is the latest in a series of strong open-source releases from Chinese AI labs.

This trend is creating a two-track open-source ecosystem, with both Western labs (Meta, Mistral) and Chinese labs (Zhipu, DeepSeek, Alibaba) producing frontier-quality open models. Alibaba's Qwen3.5, for example, offers 201-language support with an agent-first architecture.

Availability

The model weights, tokenizer, and training documentation are available on Hugging Face and ModelScope. Zhipu has also published a technical report detailing the architecture, training process, and evaluation methodology.

Community members have already begun creating quantized versions for consumer hardware, though the full model requires significant compute resources to run.
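To put "significant compute resources" in perspective, here is a back-of-the-envelope estimate of the memory needed just to hold 744B weights at common precisions. This is an illustrative calculation, not a published figure from Zhipu: it counts weight storage only and ignores KV cache, activations, and framework overhead.

```python
# Back-of-the-envelope weight-memory estimates for a 744B-parameter model.
# Weight storage only: ignores KV cache, activations, and runtime overhead.
TOTAL_PARAMS = 744e9

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,  # 16-bit floats (typical release precision)
    "int8": 1.0,       # 8-bit quantization
    "int4": 0.5,       # 4-bit quantization, common for community quants
}

def weight_gb(params: float, bytes_per_param: float) -> float:
    """Gigabytes (1e9 bytes) required to store the weights alone."""
    return params * bytes_per_param / 1e9

sizes = {name: weight_gb(TOTAL_PARAMS, b) for name, b in BYTES_PER_PARAM.items()}
# fp16 ≈ 1488 GB, int8 ≈ 744 GB, int4 ≈ 372 GB
```

Even an aggressive 4-bit quantization leaves roughly 372 GB of weights, which explains why the full model stays out of reach of consumer hardware despite community quantization efforts.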

