PulseAugur
LIVE 07:16:20
ENTITY DP-SGD

DP-SGD

PulseAugur coverage of DP-SGD — every cluster mentioning DP-SGD across labs, papers, and developer communities, ranked by signal.
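For context on the entity itself: standard DP-SGD privatizes each update by clipping every per-example gradient to a fixed norm bound, averaging, and adding Gaussian noise calibrated to that bound. A minimal sketch of one step (parameter names are illustrative, not drawn from any cluster below):

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD update direction: clip each per-example gradient to
    clip_norm, average the clipped gradients, then add Gaussian noise
    scaled by noise_multiplier * clip_norm / batch_size."""
    if rng is None:
        rng = np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # standard clipping rule: scale down only when the norm exceeds the bound
        clipped.append(g / max(1.0, norm / clip_norm))
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(
        0.0,
        noise_multiplier * clip_norm / len(per_example_grads),
        size=mean.shape,
    )
    return mean + noise
```

With `noise_multiplier=0` this reduces to plain clipped-gradient averaging, which makes the clipping behavior easy to check in isolation.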

Total · 30d: 4 (4 over 90d)
Releases · 30d: 0 (0 over 90d)
Papers · 30d: 4 (4 over 90d)
TIER MIX · 90D
SENTIMENT · 30D

3 days with sentiment data

RECENT · PAGE 1/1 · 4 TOTAL
  1. RESEARCH · CL_30626 ·

    New theory bounds KAN training, reveals privacy-utility gap

    Researchers have established new theoretical bounds for training Kolmogorov-Arnold Networks (KANs), a structured alternative to standard MLPs. The work analyzes KANs trained with mini-batch stochastic gradient descent (…

  2. TOOL · CL_27491 ·

    New DP-LAC method enhances private federated LLM fine-tuning

    Researchers have developed DP-LAC, a new method for differentially private federated fine-tuning of language models. This technique improves upon existing adaptive clipping methods by estimating an initial clipping thre…
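DP-LAC's exact rule is truncated above. As context for what "adaptive clipping" typically means here, a common baseline (quantile-based clipping in the style of prior adaptive methods, not DP-LAC itself) nudges the clip threshold toward a target quantile of observed per-example gradient norms:

```python
import numpy as np

def adaptive_clip_norm(clip_norm, grad_norms, target_quantile=0.5, lr=0.2):
    """Geometric update of the clipping threshold toward a target quantile
    of per-example gradient norms (illustrative baseline, not DP-LAC)."""
    frac_below = np.mean(np.asarray(grad_norms) <= clip_norm)
    # raise the threshold when too few norms fall below it, lower it otherwise
    return clip_norm * np.exp(-lr * (frac_below - target_quantile))
```

When exactly the target fraction of norms falls below the current threshold, the update is a no-op; otherwise the threshold moves multiplicatively toward the quantile.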

  3. RESEARCH · CL_21978 ·

    New DP-SGD subsampling methods offer improved privacy-utility trade-offs

    Two new research papers explore optimized subsampling techniques for Differentially Private Stochastic Gradient Descent (DP-SGD). The first paper, focusing on random shuffling, provides tight upper and lower bounds with…
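Both papers analyze alternatives to the Poisson subsampling that most DP-SGD privacy accountants assume. For reference, a minimal sketch of that baseline sampler (names are illustrative, not from either paper):

```python
import numpy as np

def poisson_subsample(n, sample_rate, rng):
    """Poisson subsampling: each of n examples joins the batch
    independently with probability sample_rate, so batch size is
    random rather than fixed as under shuffling."""
    mask = rng.random(n) < sample_rate
    return np.flatnonzero(mask)
```

The resulting batch size is Binomial(n, sample_rate), which is precisely the property shuffling-based schemes trade away for fixed-size batches.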

  4. RESEARCH · CL_11743 ·

    Researchers reveal supply-chain attacks can steal secrets from local LLM fine-tuning

    Researchers have developed a novel method to steal sensitive information from locally fine-tuned large language models by exploiting vulnerabilities in their supply chain code. This technique moves beyond passive weight…