Meta has released Llama 4 Maverick, a 400-billion-parameter mixture-of-experts model that the company says matches GPT-5 on reasoning and coding benchmarks. The model is available immediately under Meta's open-weights license, free for commercial use, a release that, if Meta's benchmark claims hold up, makes it the most capable openly available AI model.
Architecture
Maverick uses a mixture-of-experts (MoE) architecture with 128 expert modules, of which 12 are activated per token. This means only 52 billion parameters are used during any given inference call, making the model significantly more efficient than a dense 400B model while maintaining the knowledge capacity of the full parameter count.
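The routing step at the heart of an MoE layer can be sketched in a few lines. The example below is an illustrative toy, not Meta's implementation: it uses the article's figures (128 experts, 12 active per token) with a random router and a made-up hidden size to show how only a small subset of experts fires for each token.

```python
import numpy as np

NUM_EXPERTS = 128   # total expert modules (per the article)
TOP_K = 12          # experts activated per token (per the article)
D_MODEL = 64        # toy hidden size, chosen only for this sketch

rng = np.random.default_rng(0)
# Toy router: a linear layer scoring every expert for a given token.
router_weights = rng.standard_normal((D_MODEL, NUM_EXPERTS))

def route(token_hidden: np.ndarray) -> list[int]:
    """Return indices of the TOP_K experts selected for one token."""
    logits = token_hidden @ router_weights      # one score per expert
    top_k = np.argsort(logits)[-TOP_K:]         # keep the 12 highest
    return sorted(top_k.tolist())

token = rng.standard_normal(D_MODEL)
chosen = route(token)
print(len(chosen), "of", NUM_EXPERTS, "experts run for this token")
```

Because only the chosen experts' feed-forward weights participate in the forward pass, the per-token compute tracks the active parameter count (roughly 52B here) rather than the full 400B.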
The model supports a 256,000-token context window and native multimodal input including text, images, and video. It was trained on approximately 30 trillion tokens of text and 2 billion image-text pairs using Meta's Grand Teton AI infrastructure.
Benchmark Performance
Meta published results showing Maverick matching GPT-5 on MMLU-Pro (90.8% vs 91.4%), exceeding it on multilingual reasoning tasks, and scoring competitively on SWE-bench Verified (66.2% vs 68.1%). On mathematical reasoning benchmarks, Maverick scores 87.3% on MATH-500, within two points of the best proprietary models.
The model's strongest area is multilingual capability. Meta says Maverick was trained with particular emphasis on 24 languages beyond English and outperforms all proprietary models on multilingual benchmarks.
Open Weights License
Maverick is released under the same permissive license as previous Llama models. Companies can use it commercially, fine-tune it on proprietary data, and distribute modified versions. The license includes a usage threshold — companies with over 700 million monthly active users must request a separate license from Meta.
Meta also released Maverick-Mini, a distilled 70-billion parameter dense model derived from the full Maverick. This version runs on a single high-end GPU and scores within 10% of the full model on most benchmarks, making it practical for startups and individual developers.
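Distilling a large model into a smaller one typically means training the student to match the teacher's output distribution rather than raw labels. This is a generic sketch of that idea with toy logits; the temperature value and the exact objective Meta used for Maverick-Mini are assumptions, not published details.

```python
import numpy as np

def softmax(logits: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """Numerically stable softmax at a given temperature."""
    z = logits / temperature
    z = z - z.max()
    p = np.exp(z)
    return p / p.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the student's."""
    p = softmax(np.asarray(teacher_logits), temperature)
    q = softmax(np.asarray(student_logits), temperature)
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = np.array([4.0, 1.0, 0.5])          # toy teacher logits
aligned_student = np.array([3.8, 1.1, 0.4])  # mimics the teacher
random_student = np.array([0.2, 2.0, 3.0])   # ignores the teacher

# The student that tracks the teacher's distribution gets a lower loss.
print(distillation_loss(teacher, aligned_student) <
      distillation_loss(teacher, random_student))
```

Minimizing this loss over the training corpus is what lets a 70B dense student recover most of the 400B teacher's behavior.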
Why Meta Does This
Meta's open-weights strategy serves its business interests even though the models are free. Every company that builds on Llama instead of paying OpenAI or Google is a company that is not locked into a competitor's ecosystem. Open models also drive adoption of Meta's PyTorch framework and its custom AI hardware.
"Open AI models are infrastructure, not products," said Mark Zuckerberg in a post announcing the release. "The more people build on Llama, the better the entire ecosystem gets — including for Meta."
Community Response
The AI developer community has responded enthusiastically. Within hours of release, quantized versions optimized for consumer hardware appeared on Hugging Face. Several hosting providers including Together AI, Fireworks, and Groq announced Maverick API access at prices undercutting GPT-5 by 70% or more.
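The "quantized versions" the community produces usually store weights in low-precision integers with a scale factor, trading a little accuracy for a much smaller memory footprint. Below is a minimal sketch of symmetric per-tensor int8 quantization on a toy weight matrix; real community releases use more sophisticated schemes (e.g., per-group scales in formats like GPTQ or GGUF).

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float weights to int8 plus one per-tensor scale factor."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 4x smaller than float32; the rounding error per
# weight is bounded by half the scale factor.
print(q.dtype, float(np.abs(w - w_hat).max()))
```

For a 400B-parameter model, moving from 16-bit to 8-bit weights alone halves the memory required, which is what makes consumer-hardware deployments feasible.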
The release puts renewed pressure on OpenAI, Anthropic, and Google to justify premium pricing for proprietary models. If an open model can match frontier performance, the value proposition of closed APIs shifts from capability to reliability, support, and ecosystem integration.