Researchers have developed CorrDP, a framework that modifies differential privacy to account for feature correlations. It relaxes privacy constraints on insensitive features, even when they are correlated with sensitive ones, by quantifying those correlations with total variation distance. The framework includes algorithms for differentially private empirical risk minimization (DP-ERM) that inject distance-dependent noise into gradients, yielding improved theoretical utility guarantees. Experiments on synthetic and real-world data show that CorrDP-based DP-ERM outperforms standard DP methods when insensitive features are present.
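The general idea of distance-dependent gradient noise can be sketched as follows. This is an illustrative toy, not the paper's actual CorrDP algorithm: the `tv_dist` array is a hypothetical stand-in for per-feature total-variation-distance weights, and the clipping and noise scaling follow a generic DP gradient-descent recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression data: feature 0 "sensitive", feature 1 "insensitive".
X = rng.normal(size=(100, 2))
y = X @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=100)

# Hypothetical per-feature privacy weights: 1.0 = full DP noise for the
# sensitive feature, smaller for a weakly correlated insensitive feature.
tv_dist = np.array([1.0, 0.2])

clip_norm = 1.0   # per-example gradient clipping bound
sigma = 1.0       # base noise multiplier
lr = 0.1
w = np.zeros(2)

for _ in range(200):
    # Per-example gradients of the squared loss, clipped to bound sensitivity.
    residual = X @ w - y
    grads = residual[:, None] * X                      # shape (n, d)
    norms = np.maximum(1.0, np.linalg.norm(grads, axis=1) / clip_norm)
    grads = grads / norms[:, None]
    avg_grad = grads.mean(axis=0)
    # Gaussian noise scaled per feature by the correlation-dependent weight,
    # so the insensitive feature receives proportionally less noise.
    noise = rng.normal(size=2) * sigma * clip_norm * tv_dist / len(X)
    w -= lr * (avg_grad + noise)
```

Under this sketch, the insensitive coordinate is perturbed less, which is the intuition behind the claimed utility gains when insensitive features are present.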
Summary written by gemini-2.5-flash-lite from 2 sources.
IMPACT Introduces a more nuanced approach to differential privacy, potentially improving utility in machine learning models with correlated features.
RANK_REASON Academic paper introducing a new framework for differential privacy.