
Contrast-Enhanced Gating in GRUs for Robust Low-Data Sequence Learning

Researchers have developed a new activation function called squared sigmoid-tanh (SST), designed to improve the performance of gated recurrent units (GRUs) in sequence learning tasks, particularly when training data is limited. This parameter-free modification enhances the contrast between gate activations, yielding sharper information filtering and more stable learning. Evaluations across sign language recognition, human activity recognition, and time-series forecasting show that SST-GRUs consistently outperform standard GRUs, especially in data-scarce settings, with minimal added computational cost.
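The excerpt does not spell out the SST formula, so the sketch below assumes one plausible reading of "squared sigmoid-tanh": square the standard sigmoid in the gates and apply a sign-preserving squared tanh to the candidate state, which suppresses mid-range gate values while leaving saturated ones near 0 or 1, sharpening the gating contrast the summary describes. The cell and helper names (SSTGRUCell, sq_sigmoid, sq_tanh) are illustrative, not the authors' code.

```python
# Hedged sketch of an SST-gated GRU cell, under the assumption that SST
# squares the standard activations (sign-preserving for tanh). This is an
# illustration, not the paper's implementation.
import torch
import torch.nn as nn


def sq_sigmoid(x: torch.Tensor) -> torch.Tensor:
    """Squared sigmoid: suppresses mid-range gate values, sharpening contrast."""
    return torch.sigmoid(x) ** 2


def sq_tanh(x: torch.Tensor) -> torch.Tensor:
    """Sign-preserving squared tanh for the candidate state."""
    t = torch.tanh(x)
    return torch.sign(t) * t ** 2


class SSTGRUCell(nn.Module):
    """Standard GRU cell with SST activations swapped in (assumed form)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.x2h = nn.Linear(input_size, 3 * hidden_size)
        self.h2h = nn.Linear(hidden_size, 3 * hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        gx = self.x2h(x).chunk(3, dim=-1)
        gh = self.h2h(h).chunk(3, dim=-1)
        r = sq_sigmoid(gx[0] + gh[0])      # reset gate
        z = sq_sigmoid(gx[1] + gh[1])      # update gate
        n = sq_tanh(gx[2] + r * gh[2])     # candidate state
        return (1 - z) * n + z * h
```

Unrolled over a sequence, this cell is a drop-in replacement for nn.GRUCell; since SST adds no parameters, the only extra cost is the elementwise squaring.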


IMPACT Introduces a parameter-free modification to GRUs that improves performance in low-data sequence learning scenarios.



COVERAGE [1]

  1. arXiv cs.LG TIER_1 · Barathi Subramanian, Rathinaraja Jeyaraj, Anand Paul

    Contrast-Enhanced Gating in GRUs for Robust Low-Data Sequence Learning

    arXiv:2402.09034v3 (Announce Type: replace). Abstract: Activation functions govern how recurrent networks regulate and transmit information across temporal dependencies. Despite advances in sequence modelling, gated recurrent units (GRUs) still depend on the standard sigmoid and tan…