PulseAugur

AI21 Labs open sources Jamba, a hybrid Mamba-attention LLM

AI21 Labs has open-sourced Jamba, a new model that integrates Mamba's efficient non-transformer architecture with traditional attention layers. This hybrid approach aims to deliver high performance while maintaining efficiency. The release was discussed by AI21 co-founder Yoav Shoham on the Practical AI podcast.
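As a rough illustration of what "hybrid Mamba-attention" means structurally, the sketch below builds a layer schedule that interleaves mostly Mamba mixer layers with occasional attention layers. The specific ratios (one attention layer per block of eight, mixture-of-experts on every other MLP) follow AI21's published description of Jamba, but this is an illustrative assumption, not code from the release.

```python
def jamba_style_schedule(n_layers=32, attn_every=8, moe_every=2):
    """Sketch of a hybrid layer schedule: mostly Mamba mixers,
    with attention interleaved every `attn_every` layers and
    MoE MLPs every `moe_every` layers. Ratios are assumptions
    based on AI21's Jamba description, not the actual config."""
    schedule = []
    for i in range(n_layers):
        # One attention layer per block of `attn_every`; the rest are Mamba.
        mixer = "attention" if i % attn_every == attn_every - 1 else "mamba"
        # Alternate dense and mixture-of-experts feed-forward layers.
        mlp = "moe" if i % moe_every == 1 else "dense"
        schedule.append((mixer, mlp))
    return schedule


if __name__ == "__main__":
    sched = jamba_style_schedule()
    print(sum(1 for mixer, _ in sched if mixer == "attention"))  # prints 4
```

With these (assumed) defaults, only 4 of 32 mixer layers are attention, which is the intuition behind the efficiency claim: most layers scale linearly with sequence length like Mamba, while the sparse attention layers retain the in-context retrieval strengths of transformers.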

Summary written by gemini-2.5-flash-lite from 1 source.

Rank reason: Open-source model release from a non-frontier lab.



Coverage (1 source):

  1. Practical AI (Practical AI LLC)

    Mamba & Jamba

    First there was Mamba… now there is Jamba from AI21. This is a model that combines the best non-transformer goodness of Mamba with good ol' attention layers. This results in a highly performant and efficient model that AI21 has open sourced! We hear all about it (along with a …