Cohere Labs has released TinyAya, a family of compact multilingual language models that support approximately 67 languages while being small enough to run on consumer hardware. The release includes both a base model (TinyAya-Base, 3.35B parameters) and instruction-tuned variants.
Why TinyAya Matters
Most large language models perform well in English and a handful of other high-resource languages, but quality drops significantly for the majority of the world's languages. TinyAya addresses this by specifically optimizing for balanced performance across a wide range of languages, including many that are typically underserved by AI systems.
Supported Languages
The model covers languages from multiple regions and language families, grouped here by region:
- European — Spanish, French, German, Portuguese, Polish, Czech, and others
- Asian — Chinese, Japanese, Korean, Hindi, Bengali, Thai, Vietnamese
- African — Swahili, Yoruba, Amharic, Hausa, Zulu
- Middle Eastern — Arabic, Turkish, Persian, Hebrew
- Indigenous and low-resource — Several languages with limited training data availability
Technical Details
| Spec | Value |
|---|---|
| Parameters | 3.35B |
| Languages | ~67 |
| Variants | Base, Instruction-tuned |
| Hardware requirement | Consumer GPU or CPU |
The small model size is deliberate — at 3.35B parameters, TinyAya can run on laptops, phones, and edge devices without requiring cloud infrastructure. This is critical for deployment in regions where internet connectivity is unreliable or expensive.
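As a rough sanity check on that hardware claim, the memory needed just to hold the weights can be estimated from the parameter count. The sketch below is illustrative only; the precisions shown are common quantization levels, not formats confirmed for this release:

```python
def weights_gib(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory to hold model weights, in GiB.

    Weights only -- activations and the KV cache add further overhead.
    """
    return n_params * bytes_per_param / 2**30

N = 3.35e9  # TinyAya parameter count from the spec table above

# Illustrative precisions; the actual release may ship other formats.
for label, bpp in [("fp16/bf16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{label:>9}: ~{weights_gib(N, bpp):.1f} GiB")
```

At fp16 the weights alone come to roughly 6.2 GiB, within reach of a typical consumer GPU or a laptop's RAM; 4-bit quantization brings that under 2 GiB, which is what makes phone and edge deployment plausible.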
Benchmarks
Cohere Labs published detailed benchmarks showing that TinyAya achieves competitive performance against much larger models on multilingual tasks, particularly for languages outside the top 10 by training data availability. The key metric is balance: rather than optimizing for peak English performance, the model aims for consistent quality across all supported languages.
Use Cases
Local Deployment
Organizations in developing markets can deploy TinyAya on-premises without cloud costs, making AI accessible in contexts where cloud-based solutions are impractical.
Education
The model can power educational tools in local languages, enabling AI-assisted learning for students who don't speak English or other dominant languages.
Government Services
Public sector organizations can use TinyAya to build multilingual chatbots and document processing systems that serve all citizens regardless of language.
Open Release
The model weights, benchmarks, and a detailed technical report are publicly available. Cohere Labs has released TinyAya under a permissive license to encourage adoption and community contributions, particularly from researchers working on underrepresented languages. It joins a wave of open multilingual models including Alibaba's Qwen3.5 with 201-language support and Zhipu AI's GLM-5 released under MIT license.


