PulseAugur

BABILong

PulseAugur coverage of BABILong — every cluster mentioning BABILong across labs, papers, and developer communities, ranked by signal.

Total · 30d: 4 (4 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 4 (4 over 90d)
TIER MIX · 90D — no data shown
SENTIMENT · 30D — 1 day with sentiment data

RECENT · PAGE 1/1 · 4 TOTAL
  1. TOOL · CL_27567 ·

    FocuSFT improves LLM long-context understanding via bilevel optimization

    Researchers have developed FocuSFT, a novel bilevel optimization framework designed to improve how large language models handle long contexts. This method addresses the issue of "attention dilution," where models tend t…

  2. TOOL · CL_22116 ·

    New paper proposes residual-mass accounting for partial-KV decoding

    Researchers have developed a novel method for partial-KV decoding, which optimizes the efficiency of large language models by only computing exact softmax contributions for a subset of tokens. This approach uses learned…
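    The core idea described above — exact softmax contributions for only a subset of tokens, with the skipped mass accounted for separately — can be illustrated with a minimal sketch. The function name and the residual-mass proxy below are hypothetical; the paper reportedly learns this estimate, while the sketch falls back to a crude bound.

    ```python
    import numpy as np

    def partial_softmax_attention(q, K, V, k=4, residual_mass=None):
        """Attention where exact softmax weights are computed only for the
        top-k keys; the contribution of all remaining keys is folded into a
        single residual mass added to the partition function.
        (Hypothetical sketch, not the paper's implementation.)"""
        scores = K @ q                          # (n,) raw attention logits
        top = np.argpartition(scores, -k)[-k:]  # indices of the k largest
        m = scores[top].max()                   # stabilise the exponentials
        exact = np.exp(scores[top] - m)         # exact terms for the subset
        if residual_mass is None:
            # crude proxy: assume every skipped key scored no higher than
            # the smallest selected logit (the paper learns this instead)
            residual_mass = (len(scores) - k) * np.exp(scores[top].min() - m)
        Z = exact.sum() + residual_mass         # corrected partition function
        weights = exact / Z                     # weights now sum to < 1
        return weights @ V[top]                 # value mix over exact subset
    ```

    With `k` equal to the full key count and `residual_mass=0`, the sketch reduces exactly to standard softmax attention, which is a useful sanity check on the accounting.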

  3. TOOL · CL_16230 ·

    Q-RAG method enables efficient multi-step retrieval for LLMs up to 10M tokens

    Researchers have introduced Q-RAG, a novel method for enhancing Retrieval-Augmented Generation (RAG) systems. This approach utilizes reinforcement learning to fine-tune the embedder model for multi-step retrieval, a mor…

  4. RESEARCH · CL_11786 ·

    Understanding and Improving Length Generalization in Hierarchical Sparse Attention Models

    Researchers have identified three key design principles crucial for enhancing length generalization in hierarchical sparse attention models. These principles include using an expressive Chunk Encoder with a CLS token fo…
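    The "expressive Chunk Encoder with a CLS token" principle can be sketched as follows: prepend a learned CLS embedding to each chunk, run a small transformer encoder, and use the CLS position's output as that chunk's summary vector for the sparse upper level. All module choices and sizes here are illustrative assumptions, not the paper's architecture.

    ```python
    import torch
    import torch.nn as nn

    class ChunkEncoder(nn.Module):
        """Encode each fixed-size chunk into one summary vector via a
        prepended learned CLS token (hypothetical sketch)."""
        def __init__(self, d_model=64, n_heads=4):
            super().__init__()
            self.cls = nn.Parameter(torch.randn(1, 1, d_model))
            layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=2)

        def forward(self, chunks):              # (n_chunks, chunk_len, d)
            n = chunks.size(0)
            cls = self.cls.expand(n, -1, -1)    # one CLS per chunk
            out = self.encoder(torch.cat([cls, chunks], dim=1))
            return out[:, 0]                    # CLS position = chunk summary

    # e.g. a 4096-token sequence split into 16 chunks of 256 tokens
    x = torch.randn(16, 256, 64)
    summaries = ChunkEncoder()(x)               # (16, 64) chunk summaries
    ```

    The higher-level sparse attention would then attend over these per-chunk summaries rather than over every token, which is what makes the hierarchy cheap at long lengths.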