PulseAugur

New theory bounds KAN training, reveals privacy-utility gap

Researchers have established new theoretical bounds for training Kolmogorov-Arnold Networks (KANs), a structured alternative to standard MLPs. The work analyzes KANs trained with mini-batch stochastic gradient descent (SGD), including differentially private variants that use correlated Gaussian noise. The findings reveal a gap between the non-private and private training regimes, suggesting that polylogarithmic network width is necessary for differential privacy.
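The private training regime the summary refers to can be pictured with the standard DP-SGD update: clip each per-sample gradient to a fixed L2 norm, average, and perturb with Gaussian noise. A minimal NumPy sketch of one step, with illustrative hyperparameters (`lr`, `clip_norm`, `noise_mult`) that are not taken from the papers:

```python
import numpy as np

def dp_sgd_step(params, per_sample_grads, lr=0.1, clip_norm=1.0, noise_mult=1.0):
    """One DP-SGD step: clip per-sample gradients, average, add Gaussian noise.

    Hyperparameter values here are illustrative, not from the papers;
    noise_mult=0 recovers (clipped) non-private mini-batch SGD.
    """
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds the clipping threshold.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Gaussian perturbation calibrated to the clipping norm (independent-noise case).
    noise = np.random.normal(
        0.0, noise_mult * clip_norm / len(per_sample_grads), size=avg.shape
    )
    return params - lr * (avg + noise)
```

Setting `noise_mult=0` makes the step deterministic, which is convenient for checking the clipping logic in isolation.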

Summary written by gemini-2.5-flash-lite from 3 sources.

IMPACT Establishes theoretical underpinnings for KANs, potentially guiding future research in privacy-preserving machine learning.

RANK_REASON The cluster contains two academic papers detailing theoretical analysis and bounds for a specific type of neural network architecture (KANs) and its training dynamics, including privacy considerations.
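For readers unfamiliar with the architecture being analyzed: a KAN replaces an MLP's fixed node activations with a learnable univariate function on each edge. A toy sketch of one such layer, using a polynomial basis as a simplified stand-in for the B-spline bases that actual KANs use:

```python
import numpy as np

class KANLayer:
    """Toy KAN layer: each edge (j, i) carries a learnable univariate function,
    here a linear combination of fixed polynomial basis functions [1, x, x^2, ...].
    Real KANs use B-spline bases; polynomials are a simplification for illustration.
    """

    def __init__(self, in_dim, out_dim, degree=3, seed=0):
        rng = np.random.default_rng(seed)
        # One coefficient vector per edge: shape (out_dim, in_dim, degree + 1).
        self.coef = rng.normal(0.0, 0.1, size=(out_dim, in_dim, degree + 1))
        self.degree = degree

    def forward(self, x):
        # x: (in_dim,). Evaluate the basis per input coordinate -> (in_dim, degree+1).
        basis = np.stack([x ** k for k in range(self.degree + 1)], axis=-1)
        # Output j sums its in_dim edge functions: phi_{j,i}(x_i) = coef[j,i] . basis[i].
        return np.einsum("jid,id->j", self.coef, basis)
```

The point of the structure is that the trainable parameters are the per-edge function coefficients, which is what the papers' width-dependent bounds are stated over.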


COVERAGE [3]

  1. arXiv stat.ML TIER_1 · Puyu Wang, Jan Schuchardt, Nikita Kalinin, Junyu Zhou, Sophie Fellenz, Christoph Lampert, Marius Kloft ·

    Population Risk Bounds for Kolmogorov-Arnold Networks Trained by DP-SGD with Correlated Noise

    arXiv:2605.12648v1 Announce Type: cross Abstract: We establish the first population risk bounds for Kolmogorov-Arnold Networks (KANs) trained by mini-batch SGD with gradient clipping, covering non-private SGD as well as differentially private SGD (DP-SGD) with Gaussian perturbati…

  2. arXiv stat.ML TIER_1 · Puyu Wang, Junyu Zhou, Philipp Liznerski, Marius Kloft ·

    Optimization, Generalization and Differential Privacy Bounds for Gradient Descent on Kolmogorov-Arnold Networks

    arXiv:2601.22409v3 Announce Type: replace-cross Abstract: Kolmogorov--Arnold Networks (KANs) have recently emerged as a structured alternative to standard MLPs, yet a principled theory for their training dynamics, generalization, and privacy properties remains limited. In this pa…

  3. arXiv stat.ML TIER_1 · Marius Kloft ·

    Population Risk Bounds for Kolmogorov-Arnold Networks Trained by DP-SGD with Correlated Noise

    We establish the first population risk bounds for Kolmogorov-Arnold Networks (KANs) trained by mini-batch SGD with gradient clipping, covering non-private SGD as well as differentially private SGD (DP-SGD) with Gaussian perturbations that interpolate between independent and tempo…
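The "interpolation between independent and temporally correlated" Gaussian perturbations mentioned in the abstracts above can be illustrated with an AR(1) noise stream. Note this AR(1) form is a hypothetical stand-in chosen for clarity, not the papers' actual correlated-noise construction:

```python
import numpy as np

def correlated_noise_stream(n_steps, dim, sigma=1.0, rho=0.0, seed=0):
    """Stream of Gaussian noise vectors with AR(1) temporal correlation.

    rho=0 gives independent noise per step; rho -> 1 gives strongly
    correlated noise (rho=1 repeats the same draw every step). The
    sqrt(1 - rho^2) scaling keeps the marginal variance at sigma^2.
    """
    rng = np.random.default_rng(seed)
    z_prev = rng.normal(0.0, sigma, size=dim)
    noises = [z_prev]
    for _ in range(n_steps - 1):
        fresh = rng.normal(0.0, sigma, size=dim)
        # Mix the previous noise with a fresh draw; rho controls the correlation.
        z_prev = rho * z_prev + np.sqrt(1.0 - rho ** 2) * fresh
        noises.append(z_prev)
    return np.stack(noises)
```

In a DP-SGD loop, step t would add `noises[t]` to the averaged clipped gradient instead of drawing fresh independent noise each time.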