Wasserstein metric
PulseAugur coverage of the Wasserstein metric — every cluster mentioning the Wasserstein metric across labs, papers, and developer communities, ranked by signal.
-
New GANICE method advances causal inference with Wasserstein distance
Researchers have introduced GANICE, a new method for distributional causal inference that utilizes Generative Adversarial Networks (GANs) to estimate interventional outcome distributions. This approach addresses limitat…
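The Wasserstein distance that recurs throughout these items has a particularly simple form in one dimension, where the optimal coupling is monotone. A minimal numpy sketch of the empirical 1-Wasserstein distance (generic illustration, not part of the GANICE method; equal sample sizes assumed):

```python
import numpy as np

def wasserstein_1d(x, y):
    """Empirical 1-Wasserstein distance between two equal-size 1-D samples.

    In 1-D, W1 is the mean absolute difference between the sorted samples,
    because the optimal transport plan matches quantiles monotonically.
    """
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "equal sample sizes assumed in this sketch"
    return float(np.mean(np.abs(x - y)))

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 1000)
b = a + 2.0  # pure shift by 2
print(wasserstein_1d(a, b))  # ~2.0: W1 tracks a shift exactly
```

Shifting a sample by a constant shifts W1 by exactly that constant, which is why the metric is popular for comparing whole distributions rather than just moments.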
-
New DP sampling method uses Wasserstein distance
Researchers have introduced a new framework for differentially private sampling from distributions, utilizing Wasserstein distance as the primary utility measure. This approach addresses limitations of prior methods tha…
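For intuition on what "DP sampling with Wasserstein utility" means, here is a textbook sketch (not the paper's framework): privatize a histogram with the Laplace mechanism, sample from the noisy histogram, and judge utility by how close the private samples stay to the data. The bin count, epsilon, and add/remove-one-record sensitivity of 1 are illustrative assumptions:

```python
import numpy as np

def private_histogram_sampler(data, bins, epsilon, n_samples, seed=0):
    """DP sampling sketch: add Laplace noise (scale 1/epsilon; adding or
    removing one record changes one bin count by 1) to histogram counts,
    renormalize, then draw samples from the noisy histogram."""
    rng = np.random.default_rng(seed)
    counts, edges = np.histogram(data, bins=bins)
    noisy = counts + rng.laplace(0.0, 1.0 / epsilon, size=counts.shape)
    probs = np.clip(noisy, 0.0, None)
    probs = probs / probs.sum()
    idx = rng.choice(len(probs), size=n_samples, p=probs)
    # Sample uniformly within each chosen bin.
    return rng.uniform(edges[idx], edges[idx + 1])

rng = np.random.default_rng(4)
data = rng.normal(0.0, 1.0, 5000)
priv = private_histogram_sampler(data, bins=30, epsilon=1.0, n_samples=5000)
print(priv.mean(), priv.std())  # close to the data's mean and std
```

Utility here would be scored as the Wasserstein distance between `data` and `priv`; the paper's contribution is making that the primary design criterion rather than an afterthought.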
-
CoMemNet improves continual traffic prediction with memory replay and contrastive sampling
Researchers have introduced CoMemNet, a novel dual-branch continual learning framework designed for traffic prediction in dynamic, evolving networks. This system employs an Online branch for immediate predictions and a …
-
New research analyzes full-graph vs. mini-batch GNN training
This paper presents a comprehensive analysis comparing full-graph and mini-batch training for Graph Neural Networks (GNNs). It explores the impact of batch size and fan-out size on GNN convergence and generalization, of…
-
New Fisher Decorator method refines offline RL policies with local transport maps
Researchers have developed a new method called Fisher Decorator to improve flow-based offline reinforcement learning. This approach addresses limitations in existing methods by using a local transport map to refine poli…
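To see what a transport map does in the simplest setting, here is a generic 1-D optimal-transport sketch via quantile matching (not the Fisher Decorator's local maps): T = F_target⁻¹ ∘ F_source pushes samples of one distribution onto another.

```python
import numpy as np

def transport_map_1d(source, target):
    """Monotone (1-D optimal) transport map via quantile matching:
    send x to the target quantile at x's rank under the source."""
    src_sorted = np.sort(np.asarray(source, dtype=float))
    tgt_sorted = np.sort(np.asarray(target, dtype=float))

    def T(x):
        # Empirical-CDF rank of x under the source, in [0, 1].
        u = np.searchsorted(src_sorted, x) / len(src_sorted)
        return float(np.quantile(tgt_sorted, np.clip(u, 0.0, 1.0)))

    return T

rng = np.random.default_rng(3)
src = rng.normal(0.0, 1.0, 2000)
tgt = rng.normal(5.0, 2.0, 2000)
T = transport_map_1d(src, tgt)
pushed = np.array([T(x) for x in src])
print(pushed.mean(), pushed.std())  # roughly 5 and 2
```

The pushed-forward samples inherit the target's statistics, which is the basic mechanism a policy-refinement transport map exploits, applied locally.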
-
New methods enhance conformal prediction for uncertainty quantification
Researchers have developed novel methods for conformal prediction, a technique used for uncertainty quantification in machine learning. The first approach utilizes a differentiable nonconformity score to create a flow o…
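The split-conformal recipe these methods build on fits in a few lines. A minimal sketch under the standard exchangeability assumption (standard normal residuals here are a stand-in, not data from the paper):

```python
import numpy as np

def split_conformal_halfwidth(resid_cal, alpha=0.1):
    """Split conformal prediction: turn held-out calibration residuals
    |y_i - f(x_i)| into a half-width q such that [f(x) - q, f(x) + q]
    has >= (1 - alpha) marginal coverage for exchangeable data."""
    n = len(resid_cal)
    # ceil((n+1)(1-alpha))-th smallest residual (finite-sample correction).
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return float(np.sort(resid_cal)[min(k, n) - 1])

rng = np.random.default_rng(1)
resid = np.abs(rng.normal(0.0, 1.0, 500))   # stand-in calibration residuals
q = split_conformal_halfwidth(resid, alpha=0.1)
new = np.abs(rng.normal(0.0, 1.0, 5000))    # fresh residuals
coverage = float(np.mean(new <= q))
print(q, coverage)  # coverage at or slightly above 0.9
```

The quantile index uses n + 1 rather than n; that small correction is what makes the coverage guarantee hold exactly in finite samples, and it is the baseline the differentiable-nonconformity approaches refine.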
-
Deep neural networks achieve universality via Lindeberg exchange principle
Researchers have developed a new approach to understand the behavior of deep neural networks in their infinite-width limit. By applying a Lindeberg principle specifically adapted for deep neural networks, they can quant…
-
New Bayes posterior sampling method enhances large-data mixed models
Researchers have developed a novel stochastic mirror Langevin dynamics algorithm designed for fitting Bayesian generalized linear mixed models to large datasets. This new method addresses limitations in existing stochas…
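For readers new to Langevin samplers, here is the plain (non-mirror) unadjusted Langevin algorithm on a 1-D standard normal; the paper's stochastic mirror variant additionally runs these dynamics through a mirror map and uses stochastic gradients. Step size and iteration count below are illustrative choices:

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=0.05, n_steps=20000, seed=0):
    """Unadjusted Langevin algorithm:
    x <- x + step * grad log p(x) + sqrt(2 * step) * N(0, 1).
    Returns the chain with the first half discarded as burn-in."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    samples = np.empty(n_steps)
    for i in range(n_steps):
        x = x + step * grad_log_p(x) + np.sqrt(2.0 * step) * rng.normal()
        samples[i] = x
    return samples[n_steps // 2:]

# Target: standard normal, so grad log p(x) = -x.
s = langevin_sample(lambda x: -x, x0=3.0)
print(s.mean(), s.std())  # roughly 0 and 1
```

The finite step size introduces a small bias in the stationary distribution; mirror and Metropolis-adjusted variants exist largely to control that bias, especially for constrained or poorly conditioned posteriors.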
-
New method uses hidden states to improve AI reasoning credit assignment
Researchers have developed a new method called Span-level Hidden state Enabled Advantage Reweighting (SHEAR) to improve credit assignment in reinforcement learning for language models. SHEAR leverages the Wasserstein di…
-
New Bayesian design framework improves experimental efficiency using integral probability metrics
Researchers have developed a new Bayesian Optimal Experimental Design (BOED) framework that utilizes integral probability metrics (IPMs) to enhance stability and accuracy. This approach replaces traditional Kullback-Lei…
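Maximum Mean Discrepancy (MMD) is one of the most commonly used IPMs and makes the family concrete: an IPM is a supremum of |E_p f − E_q f| over a class of test functions, here the unit ball of an RBF-kernel RKHS. A generic sketch (illustrating IPMs, not the paper's BOED framework; bandwidth and sample sizes are illustrative):

```python
import numpy as np

def mmd2_rbf(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of squared MMD with an RBF kernel
    for 1-D samples: mean k(x,x') + mean k(y,y') - 2 mean k(x,y)."""
    def k(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2.0 * bandwidth ** 2))
    return float(k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean())

rng = np.random.default_rng(2)
same = mmd2_rbf(rng.normal(0, 1, 400), rng.normal(0, 1, 400))
diff = mmd2_rbf(rng.normal(0, 1, 400), rng.normal(3, 1, 400))
print(same, diff)  # near zero vs. clearly positive
```

Unlike a KL divergence, this stays finite and well-behaved when the two distributions have disjoint support, which is the stability property motivating IPM-based experimental design.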