PulseAugur

Researchers find most ReLU networks have identifiable parameters

A new paper explores the realization map of deep ReLU networks, investigating when a function uniquely determines its parameters up to scaling and permutation symmetries. The research introduces a framework based on weighted polyhedral complexes to analyze hidden redundancies beyond these standard symmetries. A key finding is that for architectures whose input and hidden layers have width at least two, an open set of identifiable parameters exists, implying the functional dimension equals the parameter count minus the number of hidden neurons.
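
The symmetries referenced above are the standard ones for ReLU networks: rescaling a hidden neuron's incoming weights and bias by a positive constant while dividing its outgoing weights by the same constant, or permuting hidden neurons within a layer, leaves the realized function unchanged. A minimal sketch below illustrates this for a one-hidden-layer network; the network shape, random parameters, and helper names are illustrative assumptions, not taken from the paper.

```python
# Sketch: two different parameter settings that realize the same ReLU
# network function, via the scaling and permutation symmetries.
import numpy as np

rng = np.random.default_rng(0)

def relu_net(x, W1, b1, W2, b2):
    """One-hidden-layer ReLU network: x -> W2 @ relu(W1 @ x + b1) + b2."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Illustrative sizes: 2 inputs, 3 hidden neurons, 1 output.
W1, b1 = rng.normal(size=(3, 2)), rng.normal(size=3)
W2, b2 = rng.normal(size=(1, 3)), rng.normal(size=1)

# Scaling symmetry: multiply neuron 0's incoming weights and bias by c > 0
# and divide its outgoing weights by c. Since relu(c*z) = c*relu(z) for
# c > 0, the realized function does not change.
c = 2.5
W1s, b1s, W2s = W1.copy(), b1.copy(), W2.copy()
W1s[0] *= c; b1s[0] *= c; W2s[:, 0] /= c

# Permutation symmetry: reorder the hidden neurons consistently.
perm = [2, 0, 1]
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=2)
f = relu_net(x, W1, b1, W2, b2)
assert np.allclose(f, relu_net(x, W1s, b1s, W2s, b2))  # same function
assert np.allclose(f, relu_net(x, W1p, b1p, W2p, b2))  # same function
print("scaling and permutation leave the realized function unchanged")
```

Identifiability in the paper's sense means these are essentially the only redundancies: away from degenerate parameter settings, a function realized by such a network pins down its parameters up to exactly these transformations.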

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Provides theoretical insights into the parameter identifiability of deep ReLU networks, potentially influencing future architectural designs and optimization strategies.

RANK_REASON The cluster contains an academic paper published on arXiv.

Read on arXiv cs.LG →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Moritz Grillo, Guido Montúfar ·

    Most ReLU Networks Admit Identifiable Parameters

    arXiv:2605.03601v1 Announce Type: new Abstract: We study the realization map of deep ReLU networks, focusing on when a function determines its parameters up to scaling and permutation. To analyze hidden redundancies beyond these standard symmetries, we introduce a framework based…

  2. arXiv cs.LG TIER_1 · Guido Montúfar ·

    Most ReLU Networks Admit Identifiable Parameters

    We study the realization map of deep ReLU networks, focusing on when a function determines its parameters up to scaling and permutation. To analyze hidden redundancies beyond these standard symmetries, we introduce a framework based on weighted polyhedral complexes. Our main resu…