PulseAugur · research · [2 sources]

Linear memory capacity depends on the retrieval criterion: $d^2 \asymp n\log n$ for top-1, $d^2 \asymp n$ for listwise

Researchers have analyzed the capacity limits of linear associative memory, finding that the retrieval criterion strongly affects how many associations a $d \times d$ memory can store. Under top-1 retrieval, where the correct signal must outperform all others, the threshold is $d^2 \asymp n \log n$, so the number of storable pairs grows only as $d^2/\log n$. Under listwise retrieval, which only requires the correct target to appear among a controlled list of strong candidates, the threshold relaxes to $d^2 \asymp n$, so capacity scales quadratically in $d$. The work introduces the Tail-Average Margin (TAM) criterion to formalize listwise retrieval and develops an asymptotic theory of its performance.
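The top-1 threshold can be illustrated numerically. The sketch below is a toy experiment under the isotropic Gaussian model described in the abstract, not the paper's actual construction: it stores $n$ random key-value pairs in a $d \times d$ outer-product memory and measures winner-take-all retrieval accuracy. Well below the $n\log n \asymp d^2$ threshold retrieval should be near-perfect; far above it, accuracy collapses.

```python
import numpy as np

rng = np.random.default_rng(0)

def top1_accuracy(d, n):
    """Fraction of stored pairs correctly retrieved under top-1 (winner-take-all)."""
    # Isotropic Gaussian keys and values, scaled so ||x||, ||y|| ~ 1 (assumed model).
    X = rng.normal(size=(n, d)) / np.sqrt(d)
    Y = rng.normal(size=(n, d)) / np.sqrt(d)
    M = Y.T @ X                       # d x d memory: sum_i y_i x_i^T
    scores = Y @ M @ X.T              # scores[k, j] = <y_k, M x_j>
    # Query j succeeds if its own value y_j attains the highest score.
    return float(np.mean(scores.argmax(axis=0) == np.arange(n)))

d = 64                                # d^2 = 4096 degrees of freedom
few = top1_accuracy(d, 50)            # n log n well below d^2
many = top1_accuracy(d, 2000)         # n log n well above d^2
print(f"d={d}: n=50 -> accuracy {few:.2f}, n=2000 -> accuracy {many:.2f}")
```

The crossover point and its sharpness are exactly what the paper's asymptotic theory characterizes; this simulation only shows the two regimes on either side of it.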

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Provides theoretical insights into the capacity limits of memory systems, relevant for designing future AI architectures.

RANK_REASON This is a research paper detailing theoretical findings on associative memory capacity.

Read on arXiv stat.ML →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Nicholas Barnfield, Juno Kim, Eshaan Nichani, Jason D. Lee, Yue M. Lu ·

    Sharp Capacity Thresholds in Linear Associative Memory: From Winner-Take-All to Listwise Retrieval

    arXiv:2605.05189v1 Announce Type: cross Abstract: How many key-value associations can a $d\times d$ linear memory store? We show that the answer depends not only on the $d^2$ degrees of freedom in the memory matrix, but also on the retrieval criterion. In an isotropic Gaussian mo…

  2. arXiv stat.ML TIER_1 · Yue M. Lu ·

    Sharp Capacity Thresholds in Linear Associative Memory: From Winner-Take-All to Listwise Retrieval

    How many key-value associations can a $d\times d$ linear memory store? We show that the answer depends not only on the $d^2$ degrees of freedom in the memory matrix, but also on the retrieval criterion. In an isotropic Gaussian model for the stored pairs, we show that top-1 retri…