
AI2 Releases OLMo Hybrid: Combining Transformers and RNNs for 2x Data Efficiency
The Allen Institute for AI releases OLMo Hybrid, a fully open 7B model that blends transformer attention with linear recurrent layers, matching OLMo 3's accuracy while training on 49% fewer tokens.






