The StarCoder2 family of models has been released in three sizes: 3B, 7B, and 15B parameters. The models were trained on The Stack v2, a dataset spanning over 600 programming languages. Developed collaboratively by Hugging Face, ServiceNow, and NVIDIA, StarCoder2 aims to advance code generation capabilities.
Summary written by gemini-2.5-flash-lite from 1 source.