PulseAugur

frontier release

Kimi K2 model boasts 1T parameters and SOTA HLE, while Soumith Chintala departs PyTorch

Kimi K2, a new model from Moonshot AI, boasts 1 trillion parameters and achieves state-of-the-art results on the HLE benchmark, along with strong performance on the BrowseComp and TauBench benchmarks. Separately, Soumith Chintala has departed from PyTorch.

Summary written by gemini-2.5-flash-lite from 1 source. How we write summaries →

IMPACT Sets new SOTA on HLE benchmark, potentially influencing future model development.

RANK_REASON New model release with significant parameter count and benchmark performance.

Read on Smol AINews →

COVERAGE [1]

  1. Smol AINews TIER_1

    Kimi K2 Thinking: 1T-A32B params, SOTA HLE, BrowseComp, TauBench && Soumith leaves Pytorch

    **Moonshot AI** launched **Kimi K2 Thinking**, a **1 trillion parameter** mixture-of-experts (MoE) model with **32 billion active parameters**, a **256K context window**, and native **INT4 quantization-aware training**. It achieves state-of-the-art results on benchmarks like **HLE (…
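
    The "1T total / 32B active" figures reflect how MoE models count parameters: only the top-k experts routed per token contribute to each forward pass. A minimal sketch of that arithmetic, with hypothetical expert counts chosen only so the totals land near the reported 1T/32B figures (not Kimi K2's actual architecture):

    ```python
    # Illustrative sketch of MoE "total vs. active" parameter accounting.
    # All concrete numbers below are hypothetical, picked so the results
    # roughly match the 1T-total / 32B-active figures cited above.

    def moe_param_counts(n_experts, top_k, params_per_expert, shared_params):
        """Return (total, active) parameter counts for a simple MoE stack."""
        total = shared_params + n_experts * params_per_expert   # all experts stored
        active = shared_params + top_k * params_per_expert      # experts used per token
        return total, active

    total, active = moe_param_counts(
        n_experts=384,            # hypothetical number of experts
        top_k=8,                  # hypothetical experts routed per token
        params_per_expert=2.6e9,  # hypothetical per-expert size
        shared_params=11e9,       # hypothetical attention/embedding/dense share
    )
    print(f"total = {total/1e12:.2f}T, active = {active/1e9:.0f}B")
    ```

    The point of the split is inference cost: all 1T parameters must be stored (which is where the INT4 quantization-aware training helps), but each token only pays the compute of the ~32B active subset.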