Moonshot AI has released Kimi K2, an open Mixture-of-Experts (MoE) model with 1 trillion total parameters, trained on roughly 15 trillion tokens. The release highlights advancements in open-source large language model development.
Summary written by gemini-2.5-flash-lite from 1 source.