PulseAugur

New methods train neural networks with non-differentiable components

Researchers have developed new methods for training neural networks that contain non-differentiable components, a common obstacle with spiking neurons or quantized layers, where backpropagation does not apply. One approach, detailed in an arXiv paper, parameterizes a single potential in the Kantorovich dual of optimal transport and solves it as a fixed-point problem, avoiding the adversarial training and implicit differentiation of existing approaches and enabling stable, efficient training. A second method, PolyStep, is a gradient-free optimizer that relies on forward passes only; it reports state-of-the-art results on a range of non-differentiable architectures and outperforms existing gradient-free methods.

Summary written by gemini-2.5-flash-lite from 2 sources. How we write summaries →

IMPACT Enables training of more complex neural network architectures previously intractable due to non-differentiable components.

RANK_REASON The cluster contains two academic papers detailing novel methods for training neural networks.

Read on Hugging Face Daily Papers →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Samy Wu Fung

    Fixed-Point Neural Optimal Transport without Implicit Differentiation

    We propose an implicit neural formulation of optimal transport that eliminates adversarial min-max optimization and multi-network architectures commonly used in existing approaches. Our key idea is to parameterize a single potential in the Kantorovich dual and reformulate the as… (a toy fixed-point iteration in the same spirit is sketched after this list)

  2. Hugging Face Daily Papers TIER_1

    Training Non-Differentiable Networks via Optimal Transport

    Neural networks increasingly embed non-differentiable components (spiking neurons, quantized layers, discrete routing, black-box simulators, etc.) where backpropagation is inapplicable and surrogate gradients introduce bias. We present PolyStep, a gradient-free optimizer that upda… (a forward-passes-only toy update in the same spirit is sketched below)
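To make the fixed-point idea concrete, here is a minimal sketch of its best-known classical analogue: entropic optimal transport solved by iterating a single Kantorovich dual potential to a fixed point (log-domain Sinkhorn). The paper's neural parameterization and exact reformulation are not shown in the excerpt, so everything below is illustrative only; `fixed_point_potential` and the regularizer `eps` are assumptions of this sketch, not the authors' method.

```python
import numpy as np
from scipy.special import logsumexp

def fixed_point_potential(C, a, b, eps=0.1, iters=500, tol=1e-9):
    # C: (n, m) cost matrix; a, b: source/target weights summing to 1.
    phi = np.zeros(len(a))  # the single dual potential, on the source points
    psi = np.zeros(len(b))
    for _ in range(iters):
        # entropic c-transform of phi onto the target side (a soft-min)
        psi = -eps * logsumexp((phi[:, None] - C) / eps, axis=0, b=a[:, None])
        # transform back; composing both steps gives the fixed-point map T(phi)
        phi_new = -eps * logsumexp((psi[None, :] - C) / eps, axis=1, b=b[None, :])
        if np.max(np.abs(phi_new - phi)) < tol:  # converged: T(phi) = phi
            phi = phi_new
            break
        phi = phi_new
    return phi, psi

# toy demo: squared-distance cost between two small 1-D point clouds
rng = np.random.default_rng(0)
x, y = rng.normal(size=5), rng.normal(size=7)
C = (x[:, None] - y[None, :]) ** 2
a, b = np.full(5, 1 / 5), np.full(7, 1 / 7)
phi, psi = fixed_point_potential(C, a, b)
print(a @ phi + b @ psi)  # entropic dual objective at the fixed point
```

The appeal of such a formulation, as the summary notes, is that one potential iterated to a fixed point replaces the adversarial min-max game between multiple networks, which is what makes training stable.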
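PolyStep's update rule is cut off in the excerpt, so the sketch below uses SPSA (simultaneous perturbation stochastic approximation), a classic optimizer from the same forward-passes-only family, as a stand-in; nothing here is PolyStep itself. The shared point it illustrates: because only function evaluations are used, the loss may contain hard rounding, spikes, or a black-box simulator.

```python
import numpy as np

def spsa_step(loss, theta, a=0.05, c=0.5, rng=None):
    # One SPSA update: estimate a descent direction from exactly two
    # forward passes of `loss`; no gradient is ever computed.
    rng = rng or np.random.default_rng()
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher probe
    g_hat = (loss(theta + c * delta) - loss(theta - c * delta)) / (2 * c) * delta
    return theta - a * g_hat

# toy demo: a hard-rounded quadratic whose true gradient is zero almost
# everywhere, so backpropagation would see nothing to descend
quantize = lambda t: np.round(t)
loss = lambda t: float(np.sum(quantize(t) ** 2))
theta = np.full(10, 3.0)
rng = np.random.default_rng(1)
for _ in range(3000):
    theta = spsa_step(loss, theta, rng=rng)
print(loss(theta))  # falls from 90.0 toward 0 on forward passes alone
```

For Rademacher perturbations, 1/delta equals delta elementwise, which is why multiplying the two-evaluation difference by `delta` yields the standard SPSA gradient estimate.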