PulseAugur

Radio ffn

PulseAugur coverage of Radio ffn — every cluster mentioning Radio ffn across labs, papers, and developer communities, ranked by signal.

Total · 30d: 0 (0 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 0 (0 over 90d)
TIER MIX · 90D

No coverage in the last 90 days.

RECENT · PAGE 1/1 · 6 TOTAL
  1. RESEARCH · CL_22053 ·

    DomLoRA method places single adapter at dominant module for efficient fine-tuning

    Researchers have developed a new method called DomLoRA for parameter-efficient fine-tuning of large language models. This technique identifies a single "dominant adaptation module" within a model where placing a low-ran…
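The paper's module-selection criterion is not given in the summary, but the core mechanic it describes, freezing a pretrained model and attaching a low-rank adapter to just one chosen module, can be sketched in plain PyTorch. The `LoRALinear` wrapper and the hand-picked "dominant" module below are illustrative assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a low-rank update: W x + scale * B A x."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pretrained weights stay frozen
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale

# Toy model; DomLoRA (per the summary) would identify one dominant module
# automatically -- here we pick it by hand for illustration.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 32))
for p in model.parameters():
    p.requires_grad = False
model[2] = LoRALinear(model[2], rank=4)  # adapt only the chosen module

out = model(torch.randn(2, 32))
print(out.shape)  # torch.Size([2, 32])
```

After the swap, only the adapter's `A` and `B` matrices are trainable, which is the parameter-efficiency claim: one adapter's worth of parameters instead of one per layer.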

  2. RESEARCH · CL_15880 ·

    New meta-learning LLM uses hypernetwork for adaptive textual conditioning

    Researchers have developed a novel meta-learning approach for Large Language Models (LLMs) that addresses issues of corpus heterogeneity and condition changes. This method utilizes a hypernetwork to dynamically generate…
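The summary only says a hypernetwork generates something conditioned on text; the sketch below shows the generic pattern that phrase usually denotes, a small network emitting the weights of a target layer from a condition embedding. All names and shapes here are assumptions for illustration, not the paper's architecture:

```python
import torch
import torch.nn as nn

class HyperConditionedLinear(nn.Module):
    """A hypernetwork maps a condition embedding (e.g. a corpus or domain
    descriptor) to the full weight matrix and bias of a target linear layer."""
    def __init__(self, cond_dim=8, in_dim=16, out_dim=16):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        self.hyper = nn.Linear(cond_dim, in_dim * out_dim + out_dim)

    def forward(self, x, cond):
        params = self.hyper(cond)                     # generated on the fly
        W = params[: self.in_dim * self.out_dim].view(self.out_dim, self.in_dim)
        b = params[self.in_dim * self.out_dim:]
        return x @ W.T + b

layer = HyperConditionedLinear()
x = torch.randn(4, 16)
domain_a = torch.randn(8)  # condition embeddings stand in for
domain_b = torch.randn(8)  # heterogeneous corpora / changing conditions
print(layer(x, domain_a).shape)  # torch.Size([4, 16])
```

The point of the pattern: the same input produces different outputs under different conditions because the layer's weights themselves are regenerated per condition, rather than being fixed after training.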

  3. RESEARCH · CL_14144 ·

    State Stream Transformer V2 enhances LLM reasoning with parallel training and latent state streaming

    Researchers have developed the State Stream Transformer (SST) V2, an architectural innovation designed to enhance latent space reasoning in language models. Unlike standard transformers that reset context at each step, …
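"Latent state streaming" is not specified beyond the summary, but its contrast with context that resets each step suggests a persistent latent state carried across forward passes. The recurrent cell below is only a stand-in used to make that idea concrete; SST V2's actual mechanism inside a transformer is not described here:

```python
import torch
import torch.nn as nn

class StateStreamCell(nn.Module):
    """Illustrative sketch: a persistent latent state is updated at every
    step and across calls, instead of being recomputed from scratch."""
    def __init__(self, d=32):
        super().__init__()
        self.update = nn.GRUCell(d, d)  # stands in for the model's update rule

    def forward(self, tokens, state=None):  # tokens: (batch, seq, d)
        if state is None:
            state = torch.zeros(tokens.size(0), self.update.hidden_size)
        for t in range(tokens.size(1)):
            state = self.update(tokens[:, t], state)  # state streams forward
        return state

cell = StateStreamCell()
state = cell(torch.randn(2, 5, 32))
state = cell(torch.randn(2, 3, 32), state)  # the stream continues across calls
print(state.shape)  # torch.Size([2, 32])
```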

  4. RESEARCH · CL_11458 ·

    New diagnostic tool probes LLM circuits for safety and behavior insights

    A new research paper introduces "Perturbation Probing," a diagnostic method for understanding the internal workings of large language models. This technique uses two forward passes per prompt to identify and analyze "be…
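The two-forward-passes mechanic is concrete enough to sketch: run the model once clean, once with a perturbation injected at an internal module, and score the module by how much the output shifts. The toy model and noise-injection choice below are assumptions; the paper's actual perturbation and the "be…" components it identifies are not specified in the summary:

```python
import torch
import torch.nn as nn

def perturbation_probe(model, layer, x, noise_scale=0.1):
    """Two forward passes per input: one clean, one with noise injected at
    `layer`. The output shift scores how much behaviour routes through it."""
    clean = model(x)

    def hook(module, inp, out):
        # Returning a value from a forward hook replaces the module's output.
        return out + noise_scale * torch.randn_like(out)

    handle = layer.register_forward_hook(hook)
    try:
        perturbed = model(x)
    finally:
        handle.remove()
    return (perturbed - clean).norm().item()

# Toy stand-in; the paper probes circuits inside a large language model.
torch.manual_seed(0)
model = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 16))
x = torch.randn(4, 16)
for i in (0, 2):
    print(f"layer {i} sensitivity: {perturbation_probe(model, model[i], x):.4f}")
```

Ranking modules by this score is one plausible route to the safety-relevant "behaviour localization" the summary alludes to.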

  5. RESEARCH · CL_06296 ·

    Graph Memory Transformer replaces FFNs with learned memory graphs for interpretability

    Researchers have developed a Graph Memory Transformer (GMT) that replaces the standard Feed-Forward Network (FFN) sublayer in decoder-only language models with an explicit learned memory graph. This new architecture, GM…
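One plausible reading of "explicit learned memory graph" is a fixed set of memory nodes with a learned adjacency among them, addressed by tokens the way an FFN's hidden units are. The sublayer below is a speculative sketch under that assumption, not GMT's published design:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphMemorySublayer(nn.Module):
    """Hypothetical FFN replacement: tokens attend over memory nodes whose
    values are first mixed through a learned adjacency (the 'graph')."""
    def __init__(self, d_model=64, n_nodes=32):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(n_nodes, d_model) * 0.02)
        self.values = nn.Parameter(torch.randn(n_nodes, d_model) * 0.02)
        self.adj = nn.Parameter(torch.zeros(n_nodes, n_nodes))  # learned edges

    def forward(self, x):  # x: (batch, seq, d_model)
        mixed = F.softmax(self.adj, dim=-1) @ self.values  # propagate over graph
        attn = F.softmax(x @ self.keys.T, dim=-1)          # address the memory
        return x + attn @ mixed  # residual, drop-in for the FFN sublayer

layer = GraphMemorySublayer(d_model=64, n_nodes=32)
out = layer(torch.randn(2, 10, 64))
print(out.shape)  # torch.Size([2, 10, 64])
```

The interpretability appeal of such a design is that each memory node and each edge weight is an inspectable object, unlike the distributed hidden units of a standard FFN.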

  6. RESEARCH · CL_12995 ·

    Hugging Face introduces Graph Memory Transformer replacing FFNs with learned memory graphs

    Researchers have developed a Graph Memory Transformer (GMT) that replaces the standard Feed-Forward Network (FFN) sublayer in decoder-only transformers with an explicit learned memory graph. This new architecture mainta…