
Jordan-RoPE: Non-Semisimple Relative Positional Encoding via Complex Jordan Blocks

Researchers have introduced Jordan-RoPE, a relative positional encoding for transformer models built from complex Jordan blocks. The construction yields oscillatory-polynomial features, giving attention a distance-modulated phase basis that neither RoPE nor ALiBi provides. On a WikiText-103 language model, a scaled-exact variant improved over baselines, though RoPE+ALiBi remained strongest overall, suggesting that Jordan-RoPE's structural benefits are task-specific.

Summary written from 2 sources.
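
The summary does not spell out the construction, but the algebra it names is standard: a 2x2 complex Jordan block J with eigenvalue lam = exp(i*theta) has powers J^d whose diagonal is the pure phase lam^d and whose off-diagonal entry is d * lam^(d-1), a phase modulated linearly by the lag d. A minimal numpy sketch of that structure follows; the function name and parameters are illustrative, not from the paper.

    import numpy as np

    def jordan_block_power(theta: float, d: int) -> np.ndarray:
        """d-th power of the 2x2 complex Jordan block J = [[lam, 1], [0, lam]]
        with lam = exp(i*theta); closed form [[lam^d, d*lam^(d-1)], [0, lam^d]]."""
        lam = np.exp(1j * theta)
        J = np.array([[lam, 1.0], [0.0, lam]], dtype=complex)
        return np.linalg.matrix_power(J, d)

    # Diagonal entry: a pure rotary phase, the semisimple part RoPE already has.
    # Off-diagonal entry: d * lam^(d-1), the same oscillation with amplitude
    # growing linearly in the query-key lag d -- an oscillatory-polynomial
    # feature that no diagonalisable (semisimple) encoding can produce.
    for d in (1, 4, 16):
        P = jordan_block_power(np.pi / 8, d)
        print(d, abs(P[0, 0]), abs(P[0, 1]))  # |lam^d| = 1, |d * lam^(d-1)| = d

The linear growth in the lag is unbounded, which is presumably what the paper's scaled-exact variant normalizes, though the excerpt does not say.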

IMPACT Introduces a new positional encoding technique that may offer advantages for specific language modeling tasks involving distance-modulated phase interactions.

RANK_REASON This is a research paper detailing a new method for positional encoding in transformer models.

Read on arXiv cs.LG →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Yaobo Zhang

    Jordan-RoPE: Non-Semisimple Relative Positional Encoding via Complex Jordan Blocks

    arXiv:2605.04217v1 · Abstract: Relative positional encodings determine which functions of query-key lag can enter the primitive attention logit. RoPE supplies a rotary phase, while ALiBi supplies an additive distance bias. Motivated by group-theoretic views of li…

  2. arXiv cs.CL TIER_1 · Yaobo Zhang

    Jordan-RoPE: Non-Semisimple Relative Positional Encoding via Complex Jordan Blocks

    Relative positional encodings determine which functions of query-key lag can enter the primitive attention logit. RoPE supplies a rotary phase, while ALiBi supplies an additive distance bias. Motivated by group-theoretic views of linear translation-invariant positional encodings,…
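
Both abstracts open with the same framing: RoPE contributes a rotary phase to the attention logit, while ALiBi contributes an additive distance bias. For reference, a minimal sketch of those two mechanisms on toy 2-d vectors; theta and slope are illustrative hyperparameters, not values from the paper.

    import numpy as np

    def rotate(x: np.ndarray, angle: float) -> np.ndarray:
        """2-d rotation, the single-frequency building block of RoPE."""
        c, s = np.cos(angle), np.sin(angle)
        return np.array([[c, -s], [s, c]]) @ x

    def rope_logit(q, k, theta, m, n):
        """RoPE: rotate q and k by position-proportional angles; the logit
        then depends on m and n only through the lag m - n (a rotary phase)."""
        return rotate(q, theta * m) @ rotate(k, theta * n)

    def alibi_logit(q, k, slope, m, n):
        """ALiBi: leave q and k untouched and subtract a penalty linear
        in the query-key distance (an additive bias)."""
        return q @ k - slope * abs(m - n)

    q, k = np.array([1.0, 0.5]), np.array([0.3, -0.2])
    for lag in (0, 1, 8):  # the RoPE logit oscillates with lag; ALiBi decays linearly
        print(lag, rope_logit(q, k, 0.1, 5 + lag, 5), alibi_logit(q, k, 0.05, 5 + lag, 5))

Both mechanisms produce logits that are functions of the lag alone, which is the shared property the paper's Jordan-block construction generalizes.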