PulseAugur
research
Neural Garbage Collection learns to forget while reasoning to compress KV cache

Researchers have developed Neural Garbage Collection (NGC), a technique that lets a language model manage its own memory while reasoning. The model itself decides which parts of its KV cache to evict during the reasoning process, with no hand-designed eviction rules. Trained solely on task-outcome rewards, NGC achieved significant KV-cache compression while maintaining high accuracy on complex reasoning tasks such as Countdown and AIME.
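The core idea described above, a policy that scores cache entries and evicts the lowest scorers during decoding, can be sketched in a toy form. Everything here (the function names, the random stand-in for the learned policy score, the fixed budget) is an illustrative assumption, not the paper's actual implementation:

```python
# Hypothetical sketch of learned KV-cache eviction ("garbage collection")
# during decoding. The scoring policy is a random stand-in; in NGC it
# would be learned from task-outcome rewards.

import numpy as np

rng = np.random.default_rng(0)

def evict(kv_cache, scores, budget):
    """Keep only the `budget` highest-scoring cache entries."""
    if len(kv_cache) <= budget:
        return kv_cache, scores
    keep = np.argsort(scores)[-budget:]   # indices of entries to retain
    keep = np.sort(keep)                  # preserve original token order
    return [kv_cache[i] for i in keep], scores[keep]

# Toy decode loop: each step appends a new (key, value) pair, the policy
# scores every entry, and low scorers are evicted past the budget.
budget = 8
kv_cache, scores = [], np.empty(0)
for step in range(20):
    kv_cache.append((f"k{step}", f"v{step}"))
    scores = np.append(scores, rng.random())  # stand-in for policy score
    kv_cache, scores = evict(kv_cache, scores, budget)

print(len(kv_cache))  # bounded by the budget regardless of sequence length
```

In the paper's setting the eviction decisions would be made token-by-token by the model and the policy would be rewarded only for final task success, which is what makes the compression "learned" rather than rule-based.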

Summary written by gemini-2.5-flash-lite from 1 source.


Read on Hugging Face Daily Papers →
COVERAGE [1]

  1. Hugging Face Daily Papers (Tier 1)

    Neural Garbage Collection: Learning to Forget while Learning to Reason

    Chain-of-thought reasoning has driven striking advances in language model capability, yet every reasoning step grows the KV cache, creating a bottleneck to scaling this paradigm further. Current approaches manage these constraints on the model's behalf using hand-designed criteri…