Abacus AI has released Smaug-72B, a new large language model. The model is notable for its performance across benchmarks, including state-of-the-art results on the MT-Bench leaderboard. Smaug-72B was trained on a dataset of 1.5 trillion tokens and is available for research purposes.
Summary written by gemini-2.5-flash-lite from 1 source.