PulseAugur
research · [2 sources]

New research explores ensemble models for improved AI performance and robustness

Two new research papers introduce methods for improving ensemble models in machine learning. The first, PACE, combines pruning and compression to build smaller, more efficient, and more interpretable ensembles that outperform existing methods. The second, Perturb-and-Correct (P&C), applies post-hoc perturbations to a single pretrained network to generate diverse predictors that agree on calibration data while differing elsewhere, demonstrating a strong trade-off between in-distribution and out-of-distribution performance.
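
The P&C idea described above can be illustrated with a minimal sketch: perturb a trained predictor's weights, keep only perturbations that preserve its outputs on a calibration set, and ensemble the survivors. The model, perturbation scale, and agreement threshold below are illustrative assumptions, not the paper's actual algorithm.

```python
# Hedged sketch of a Perturb-and-Correct-style procedure on a toy
# linear regressor. All names and thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" linear predictor and a small calibration set.
X_cal = rng.normal(size=(50, 4))   # calibration inputs
w_base = rng.normal(size=4)        # pretrained weights
y_cal = X_cal @ w_base             # base predictions on calibration data

def agrees_on_calibration(w, tol=0.2):
    """Accept a perturbed model only if its predictions stay close to
    the base model's on the calibration set."""
    return np.max(np.abs(X_cal @ w - y_cal)) < tol

# Generate diverse predictors via small random weight perturbations,
# keeping only those that still agree on calibration data.
ensemble = []
while len(ensemble) < 5:
    w_pert = w_base + 0.01 * rng.normal(size=4)
    if agrees_on_calibration(w_pert):
        ensemble.append(w_pert)

def predict(x):
    """Average the ensemble's predictions; the spread across members
    can serve as an uncertainty signal under distribution shift."""
    preds = np.array([x @ w for w in ensemble])
    return preds.mean(axis=0), preds.std(axis=0)

x_test = rng.normal(size=4)
mean, std = predict(x_test)
print(mean, std)
```

Members are indistinguishable where calibration data constrains them but free to diverge off-distribution, which is where their disagreement becomes informative.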

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT These papers explore techniques to enhance the efficiency and robustness of machine learning models, potentially leading to better performance in complex prediction tasks.

RANK_REASON Two academic papers published on arXiv present new methods for improving ensemble models.

Read on arXiv cs.LG →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Fabian Akkerman, Julien Ferry, Théo Guyard, Thibaut Vidal ·

    PACE: Prune-And-Compress Ensemble Models

    arXiv:2605.06278v1 Announce Type: new Abstract: Ensemble models achieve state-of-the-art performance on prediction tasks, but usually require aggregating a large number of weak learners. This can hinder deployment, interpretability, and downstream tasks such as robustness verific…

  2. arXiv cs.LG TIER_1 · Eleanor Quint ·

    Perturb and Correct: Post-Hoc Ensembles using Affine Redundancy

    arXiv:2605.01632v1 Announce Type: new Abstract: Models that are indistinguishable on in-distribution data can behave very differently under distribution shift. We introduce Perturb-and-Correct (P&C), a post-hoc method for constructing epistemically diverse predictors from a s…