A new theoretical study explores neural network architectures that use the stationary states of dissipative Schrödinger-type dynamics on learned latent graphs. The work introduces a method for learning the graph structure itself by optimizing over weighted graphs equipped with a metric chosen so that natural-gradient descent is well posed. The framework establishes equivalences between multilayer stationary networks, global stationary problems, and other architecture types, and offers complexity bounds that depend on sparse graph geometry rather than dense connectivity.
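To make the setup concrete, below is a minimal sketch under assumptions not stated in the summary: it treats the dissipative Schrödinger-type dynamics as linear, diffusion-like dynamics dh/dt = -(L + γI)h + Bx on a weighted graph with Laplacian L, so a "stationary layer" reduces to solving a linear system. The names graph_laplacian and stationary_layer, the dissipation rate gamma, and the input map B are illustrative, not taken from the paper.

```python
# Hypothetical sketch of a single "stationary layer" on a small weighted graph.
# Assumption (not from the source): the dissipative dynamics are
#   dh/dt = -(L + gamma*I) h + B x,
# whose stationary state h* solves (L + gamma*I) h* = B x, where L is the
# Laplacian of the learned weighted graph and gamma > 0 adds dissipation.
import numpy as np


def graph_laplacian(W: np.ndarray) -> np.ndarray:
    """Combinatorial Laplacian L = D - W of a symmetric weight matrix W."""
    return np.diag(W.sum(axis=1)) - W


def stationary_layer(W: np.ndarray, B: np.ndarray, x: np.ndarray,
                     gamma: float = 0.1) -> np.ndarray:
    """Stationary state of dh/dt = -(L + gamma*I) h + B x on the graph given by W."""
    L = graph_laplacian(W)
    n = L.shape[0]
    # L + gamma*I is symmetric positive definite for gamma > 0, so the solve is well posed.
    return np.linalg.solve(L + gamma * np.eye(n), B @ x)


# Tiny example: a 4-node path graph with fixed edge weights standing in for learned ones.
rng = np.random.default_rng(0)
W = np.zeros((4, 4))
for i, j, w in [(0, 1, 0.8), (1, 2, 0.5), (2, 3, 1.2)]:
    W[i, j] = W[j, i] = w
B = rng.normal(size=(4, 3))   # input-to-node map
x = rng.normal(size=3)        # input feature vector
print(stationary_layer(W, B, x))
```

In this reading, stacking such layers (each with its own learned graph) gives a multilayer stationary network, and learning W corresponds to the paper's optimization over weighted graphs; the specific metric used to keep natural-gradient descent well posed is not reproduced here.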
Summary written by gemini-2.5-flash-lite from 1 source.
IMPACT Introduces a novel theoretical framework for neural network architecture, with potential influence on future model design and complexity analysis.
RANK_REASON This is a theoretical study published as an arXiv preprint.