PulseAugur

Phase-Coherent Transformer advances complex-valued neural networks

Researchers have developed a new neural network architecture called the Phase-Coherent Transformer (PCT). This model modifies the attention mechanism of standard Transformers to better preserve phase information across layers, which is crucial for phase-sensitive computation. Experiments show that PCT outperforms existing real-valued and complex-valued Transformers on various benchmarks, including those involving long-range memory and reasoning, without suffering from accuracy collapse at greater depths.
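The summary does not describe PCT's exact mechanism, but the tension it names, between softmax's row-normalised competition and phase preservation, can be sketched in a few lines. The snippet below is an illustrative assumption, not the paper's method: attention scores are taken from the real part of the complex inner product so the softmax stays real-valued, while the phases of the value vectors pass through the weighted sum unchanged by any nonlinearity.

```python
import numpy as np

def phase_preserving_attention(Q, K, V):
    # Hypothetical sketch of complex-valued attention: scores come from
    # the real part of the Hermitian inner product, so softmax operates
    # on real numbers; V's complex phases are only mixed linearly.
    d = Q.shape[-1]
    scores = (Q @ K.conj().T).real / np.sqrt(d)
    # Numerically stable row-normalised softmax (the "token competition"
    # the abstract refers to).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # complex output

rng = np.random.default_rng(0)
n, d = 4, 8
Q = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))
K = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))
V = rng.standard_normal((n, d)) + 1j * rng.standard_normal((n, d))
out = phase_preserving_attention(Q, K, V)
print(out.shape, out.dtype)
```

Because the attention weights here are real and non-negative, each output token is a convex combination of complex value vectors, which preserves relative phase relationships in a way a generic complex softmax would not.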

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Introduces a novel architecture that improves generalization in complex-valued Transformers, potentially impacting future model designs for tasks requiring phase-sensitive computations.

RANK_REASON The cluster contains a new academic paper detailing a novel model architecture.

Read on arXiv cs.LG →

COVERAGE [1]

  1. arXiv cs.LG TIER_1 · Leona Hioki

    Complex-Valued Phase-Coherent Transformer

    Complex-valued Transformers have largely inherited softmax attention from real-valued architectures. However, row-normalised token competition is not necessarily aligned with phase-preserving computation. In this paper, we introduce the Phase-Coherent Transformer (PCT), which app…