DeepSeek has released a preview of its V4-Pro model, a Mixture-of-Experts (MoE) architecture with 1.6 trillion parameters. The release is positioned as a competitor to models like OpenAI's GPT-5 and Anthropic's Opus 4.7. The model was benchmarked on a task in which three agents collaborated to build a command-line interface in Rust.
Summary written by gemini-2.5-flash-lite from 1 source.
IMPACT Sets a new benchmark for MoE models, potentially influencing future large-scale model development and competition.
RANK_REASON Frontier-lab model release with system card.