Alibaba's Qwen team has released Qwen 3, a suite of models ranging from 0.6 billion to 235 billion parameters. The lineup includes both dense and Mixture-of-Experts (MoE) variants, which reportedly surpass competing reasoning models such as DeepSeek-R1 and OpenAI's o1 in performance. The range of sizes caters to different computational budgets and applications.
Summary written by gemini-2.5-flash-lite from 1 source.