Microsoft researchers will present 11 papers at the NSDI '26 conference, focusing on advances in large-scale networked systems for cloud computing and AI. Several papers explore novel ways to optimize AI model performance and infrastructure, including techniques for sharing KV caches to boost LLM throughput and methods for automating model-based testing of network protocols. Other highlights include enhancements to disaggregated memory pods that reduce cost and improve performance, and new traffic-engineering strategies for optical networks.
IMPACT These advancements in AI systems and networking infrastructure could lead to more efficient and cost-effective deployment of large-scale AI models.
RANK_REASON The cluster contains multiple research papers presented at a scientific conference, detailing technical advancements in AI systems and networking.