Neural tangent kernel
PulseAugur coverage of Neural tangent kernel — every cluster mentioning Neural tangent kernel across labs, papers, and developer communities, ranked by signal.
- 2026-05-13 research_milestone Publication of a paper introducing a force-aware Neural Tangent Kernel for active learning of MLIPs.
- New framework enables scalable, robust active learning for MLIPs
  Researchers have developed a new active learning framework for machine-learning interatomic potentials (MLIPs) that addresses scalability and robustness challenges. This framework utilizes a force-aware Neural Tangent K…
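The construction above builds on the empirical Neural Tangent Kernel, the Gram matrix K(x, x') = ∇θf(x) · ∇θf(x') of per-input parameter gradients. As a minimal sketch, the plain (force-free) empirical NTK for a toy two-layer scalar network, with a finite-difference Jacobian; the network and all sizes here are illustrative stand-ins, not the paper's force-aware construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar-output network: f(x) = v . tanh(W x)
d_in, d_hid = 3, 5
theta0 = np.concatenate([
    rng.normal(size=d_hid * d_in) / np.sqrt(d_in),   # W, flattened
    rng.normal(size=d_hid) / np.sqrt(d_hid),         # v
])

def f(theta, x):
    W = theta[:d_hid * d_in].reshape(d_hid, d_in)
    v = theta[d_hid * d_in:]
    return v @ np.tanh(W @ x)

def param_grad(theta, x, eps=1e-5):
    # Central finite differences of f w.r.t. every parameter.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (f(tp, x) - f(tm, x)) / (2 * eps)
    return g

X = rng.normal(size=(4, d_in))                   # four sample inputs
J = np.stack([param_grad(theta0, x) for x in X])  # (4, n_params) Jacobian
K = J @ J.T                                       # empirical NTK Gram matrix
```

K is symmetric positive semidefinite by construction; a force-aware variant would additionally differentiate forces (derivatives of the model's energy with respect to atomic positions), which this sketch does not attempt.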
- Paper explores preconditioned gradient descent's impact on neural network learning regimes
  This paper investigates how preconditioned gradient descent (PGD) methods, like Gauss-Newton, influence spectral bias and the phenomenon of grokking in neural networks. Researchers propose that PGD can mitigate spectral…
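The paper's exact setup isn't given here, but the core intuition behind preconditioning versus spectral bias can be sketched on an ill-conditioned linear least-squares problem (the matrices and step counts below are made-up illustrations): plain gradient descent resolves stiff directions quickly and flat ones extremely slowly, while a Gauss-Newton step rescales curvature and removes the gap.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ill-conditioned least squares: singular values 10 and 0.1 stand in
# for a network's fast "low-frequency" and slow "high-frequency" modes.
U, _ = np.linalg.qr(rng.normal(size=(50, 2)))
X = U * np.array([10.0, 0.1])
w_true = np.array([1.0, 1.0])
y = X @ w_true

def gd(steps, lr=1e-2):
    w = np.zeros(2)
    for _ in range(steps):
        w -= lr * X.T @ (X @ w - y)      # plain gradient descent
    return w

def gauss_newton(steps):
    w = np.zeros(2)
    H = X.T @ X                          # Gauss-Newton curvature (exact here)
    for _ in range(steps):
        w -= np.linalg.solve(H, X.T @ (X @ w - y))
    return w

err_gd = np.abs(gd(200) - w_true)        # flat direction barely moves
err_gn = np.abs(gauss_newton(1) - w_true)  # one preconditioned step suffices
```

For this linear problem one Gauss-Newton step is exact; in a nonlinear network the analogy is only local, which is part of what makes the paper's question nontrivial.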
- New research explains why Zeroth-Order Optimization scales to LLMs
  Two new papers explore zeroth-order (ZO) optimization for fine-tuning large language models (LLMs). The first paper introduces a kernel perspective, showing that the approximation error depends on output size rather tha…
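ZO methods estimate gradients from forward passes alone, typically via a two-point SPSA-style probe along a random direction, which is why they need no backpropagation memory. A minimal sketch on a toy quadratic loss (the loss, dimension, and step sizes are placeholders, not either paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy PSD quadratic standing in for a fine-tuning loss.
B = rng.normal(size=(10, 10))
A = B @ B.T / 10

def loss(theta):
    return 0.5 * theta @ A @ theta

theta0 = rng.normal(size=10)
theta = theta0.copy()
lr, eps = 0.02, 1e-3

for _ in range(2000):
    z = rng.normal(size=theta.shape)   # random probe direction
    # Two forward passes only -- no backprop, O(1) extra memory.
    g_hat = (loss(theta + eps * z) - loss(theta - eps * z)) / (2 * eps)
    theta -= lr * g_hat * z            # SPSA-style update along z
```

In expectation the probe recovers the true gradient (E[z zᵀ] = I), at the cost of higher variance per step, the trade-off the kernel-perspective paper analyzes.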
- New theories explore how pre-training and sparse connectivity enhance deep learning generalization
  Three new papers explore the theoretical underpinnings of generalization in deep learning. One paper identifies pre-training as a critical factor for weak-to-strong generalization, demonstrating its emergence through a …