PulseAugur
Random neural network fluctuations exhibit phase transitions based on fixed points

Researchers have established central and non-central limit theorems for functionals of infinitely-wide random neural networks on the d-dimensional sphere. The study shows that the asymptotic behavior of these functionals as network depth increases depends critically on the fixed points of the covariance function, yielding three distinct limiting regimes: convergence to a functional of a limiting Gaussian field, convergence to a Gaussian distribution, or convergence to a distribution in the Qth Wiener chaos.
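The role of fixed points of the covariance can be illustrated with a hedged sketch (this is not the paper's own construction): for an infinitely-wide ReLU network, the correlation between the Gaussian outputs at two inputs evolves with depth under the well-known arccosine-kernel map (Cho & Saul), and iterating that map drives the correlation toward its fixed point.

```python
import math

def relu_corr_map(rho: float) -> float:
    # One-layer depth update of the input correlation for an infinitely-wide
    # ReLU network (normalized arccosine kernel). rho = 1 is a fixed point.
    rho = max(-1.0, min(1.0, rho))  # clamp against floating-point drift
    return (math.sqrt(1 - rho**2) + rho * (math.pi - math.acos(rho))) / math.pi

rho = 0.2  # correlation between two inputs on the sphere
for depth in range(200):
    rho = relu_corr_map(rho)

# rho has moved close to the fixed point rho* = 1; convergence is slow
# (sub-geometric) because the map's derivative at rho = 1 equals 1.
print(rho)
```

Which fixed point the depth recursion settles into, and how fast, is exactly the kind of information that determines which of the three limiting regimes applies in results of this type.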

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Provides theoretical insights into the behavior of deep neural networks, potentially informing future architectural designs.

RANK_REASON Academic paper on theoretical aspects of random neural networks.

Read on arXiv stat.ML →


COVERAGE [1]

  1. arXiv stat.ML TIER_1 · Domenico Marinucci

    Phase Transitions in the Fluctuations of Functionals of Random Neural Networks

    We establish central and non-central limit theorems for sequences of functionals of the Gaussian output of an infinitely-wide random neural network on the d-dimensional sphere. We show that the asymptotic behaviour of these functionals as the depth of the network increases depen…