Researchers have analyzed the capacity limits of linear associative memory, finding that the retrieval criterion significantly impacts how many associations can be stored. For top-1 retrieval, where the correct target's signal must outperform all competitors, the required memory size scales as $d^2 \asymp n \log n$, where $d$ is the embedding dimension and $n$ is the number of stored associations. Under listwise retrieval, which only requires the correct target to appear among a controlled list of strong candidates, the requirement relaxes to $d^2 \asymp n$, so capacity grows quadratically in $d$. This work introduces the Tail-Average Margin (TAM) criterion to formalize listwise retrieval and develops an asymptotic theory for its performance.
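The contrast between the two criteria can be illustrated with a minimal sketch of an outer-product (Hebbian) linear associative memory. The random unit-norm embeddings, the Hebbian storage rule, and the chosen values of `d`, `n`, and `k` are illustrative assumptions, not necessarily the paper's construction; top-k membership here stands in for the listwise criterion that TAM formalizes.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 256, 500, 5  # embedding dim, number of associations, list length (assumed values)

# Random unit-norm key/value embeddings (illustrative setup).
U = rng.standard_normal((n, d))
U /= np.linalg.norm(U, axis=1, keepdims=True)
V = rng.standard_normal((n, d))
V /= np.linalg.norm(V, axis=1, keepdims=True)

# Hebbian linear associative memory: W = sum_i v_i u_i^T, stored in d^2 parameters.
W = V.T @ U

# Query with every stored key; score each retrieved vector against all n targets.
retrieved = U @ W.T            # (n, d): row i approximates v_i plus interference
scores = retrieved @ V.T       # (n, n): scores[i, j] = <W u_i, v_j>

# Top-1 criterion: the correct target must beat every competitor.
top1_acc = np.mean(scores.argmax(axis=1) == np.arange(n))

# Listwise (top-k) criterion: the correct target need only rank in the top k.
topk = np.argsort(-scores, axis=1)[:, :k]
topk_acc = np.mean(np.any(topk == np.arange(n)[:, None], axis=1))

print(f"top-1 accuracy: {top1_acc:.3f}, top-{k} accuracy: {topk_acc:.3f}")
```

By construction the top-k rate is at least the top-1 rate; pushing `n` up relative to `d^2` makes the gap between the two criteria visible, mirroring the $n \log n$ versus $n$ scaling in the summary.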
IMPACT Provides theoretical insights into the capacity limits of memory systems, relevant for designing future AI architectures.