PulseAugur

WikiText-2

PulseAugur coverage of WikiText-2 — every cluster mentioning WikiText-2 across labs, papers, and developer communities, ranked by signal.

Total · 30d: 4 (4 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 4 (4 over 90d)
TIER MIX · 90D: (chart)
SENTIMENT · 30D: 1 day with sentiment data

RECENT · PAGE 1/1 · 4 TOTAL
  1. TOOL · CL_28353 ·

    New BCJR-QAT method pushes LLM quantization to 2 bits per weight

    Researchers have developed BCJR-QAT, a novel method for quantizing large language models to 2 bits per weight, a significant advancement beyond current post-training quantization techniques. This new approach uses a dif…
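The BCJR-QAT method itself is not described in this excerpt, so the sketch below shows only the baseline it improves on: plain uniform 2-bit (4-level) weight quantization. All names and values here are illustrative, not part of the cluster's method.

```python
import numpy as np

def quantize_2bit(weights: np.ndarray):
    """Uniformly quantize a weight tensor to 2 bits (4 levels) per weight.

    Illustrative baseline only; BCJR-QAT's actual scheme is not given
    in the summary above.
    """
    lo, hi = float(weights.min()), float(weights.max())
    scale = (hi - lo) / 3.0  # 4 levels span 3 intervals
    codes = np.clip(np.round((weights - lo) / scale), 0, 3).astype(np.uint8)
    return codes, lo, scale

def dequantize_2bit(codes: np.ndarray, lo: float, scale: float) -> np.ndarray:
    """Reconstruct approximate weights from 2-bit codes."""
    return lo + codes.astype(np.float32) * scale

w = np.array([-0.9, -0.3, 0.1, 0.8], dtype=np.float32)
codes, lo, scale = quantize_2bit(w)
w_hat = dequantize_2bit(codes, lo, scale)
```

With uniform rounding, each reconstructed weight lies within half a quantization step of the original, which is the error floor quantization-aware training methods aim to push below.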

  2. RESEARCH · CL_21794 ·

New parameter E predicts Mixture-of-Experts model health, preventing dead experts

    Researchers have introduced a new dimensionless control parameter, E = T*H/(O+B), to predict the health of expert ecologies in Mixture-of-Experts (MoE) models. This parameter, derived from four hyperparameters, can prev…
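The summary gives the formula E = T*H/(O+B) but does not name the four hyperparameters, so the helper below treats T, H, O, and B as opaque inputs; the function name and example values are placeholders, not from the paper.

```python
def expert_ecology_health(T: float, H: float, O: float, B: float) -> float:
    """Dimensionless control parameter E = T*H/(O+B) from the summary.

    The meanings of the four hyperparameters are not stated in the
    excerpt; the arguments here are illustrative placeholders.
    """
    return (T * H) / (O + B)

# Example with arbitrary placeholder values:
E = expert_ecology_health(T=2.0, H=8.0, O=3.0, B=1.0)  # -> 4.0
```

Because E is a ratio of hyperparameter products, it is dimensionless whenever T*H and O+B share units, which is what lets a single threshold on E flag unhealthy expert ecologies before training.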

  3. TOOL · CL_20375 ·

    New MetaAdamW optimizer uses self-attention for adaptive learning rates

    Researchers have developed MetaAdamW, a novel optimizer that enhances adaptive learning rates and weight decay by employing a self-attention mechanism. This Transformer-based approach dynamically adjusts hyperparameters…

  4. RESEARCH · CL_10083 ·

    Associative-State Universal Transformers improve parameter efficiency with sparse retrieval

    Researchers have developed UniMatrix, a novel Universal Transformer architecture that integrates structured recurrence with sparse retrieval mechanisms. While initial versions showed parameter efficiency and competitive…