PulseAugur
research · [2 sources]

ELBO-based hyperparameter learning can overfit Bayesian models, a caution for practitioners

A new paper explores the relationship between the evidence lower bound (ELBO) and Occam's razor in Bayesian model selection. The authors demonstrate that ELBO-based hyperparameter learning can lead to overfitting, contrary to the principle of Occam's razor, which favors simpler models. Surprisingly, Bayesian model selection using the evidence itself sometimes prefers the overfit model, while the ELBO does not. The findings suggest that practitioners should be cautious about how reduced-rank assumptions, often necessary for tractability in large models, can affect model selection.

Summary written by gemini-2.5-flash-lite from 2 sources.
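To make the evidence-versus-ELBO distinction concrete, here is a minimal sketch, not the paper's experiment: in conjugate Bayesian linear regression, both the exact log evidence and the ELBO of a factorized Gaussian posterior are available in closed form, so one can compare which prior precision each criterion selects. The toy data, the factorized q, and all names (log_evidence, elbo_factorized, the grid over alpha) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data with a correlated design, so the exact posterior over weights
# has off-diagonal covariance that a factorized q cannot represent.
n, d, sigma2 = 40, 5, 0.5 ** 2
X = rng.normal(size=(n, d)) @ rng.normal(size=(d, d))
y = X @ rng.normal(size=d) + np.sqrt(sigma2) * rng.normal(size=n)

def log_evidence(alpha):
    # Exact log marginal likelihood: y ~ N(0, sigma2*I + alpha^{-1} X X^T).
    C = sigma2 * np.eye(n) + (X @ X.T) / alpha
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

def elbo_factorized(alpha):
    # ELBO at a factorized Gaussian q(w) = N(m, diag(s)) built from the
    # exact posterior's mean and marginal variances. Any q yields a valid
    # lower bound; dropping the posterior correlations loosens it.
    Lam = alpha * np.eye(d) + (X.T @ X) / sigma2   # exact posterior precision
    Sigma = np.linalg.inv(Lam)
    m = Sigma @ X.T @ y / sigma2
    s = np.diag(Sigma)
    exp_lik = (-0.5 * n * np.log(2 * np.pi * sigma2)
               - (np.sum((y - X @ m) ** 2) + np.sum((X ** 2) @ s)) / (2 * sigma2))
    exp_prior = (-0.5 * d * np.log(2 * np.pi) + 0.5 * d * np.log(alpha)
                 - 0.5 * alpha * (m @ m + s.sum()))
    entropy = 0.5 * d * np.log(2 * np.pi * np.e) + 0.5 * np.log(s).sum()
    return exp_lik + exp_prior + entropy

# Select the prior precision alpha by each criterion; the two argmaxes
# can disagree, and the gap between the curves is the cost of factorizing.
alphas = np.logspace(-3, 3, 200)
ev = np.array([log_evidence(a) for a in alphas])
lb = np.array([elbo_factorized(a) for a in alphas])
print(f"alpha chosen by evidence: {alphas[ev.argmax()]:.4f}")
print(f"alpha chosen by ELBO:     {alphas[lb.argmax()]:.4f}")
print(f"bound gap at evidence's choice: {ev.max() - lb[ev.argmax()]:.4f} nats")
```

With a full-covariance q equal to the exact posterior, the bound is tight and both criteria agree; the factorized q stands in for the kind of reduced-rank approximation the summary above warns about.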

IMPACT Highlights potential pitfalls in model selection for large Bayesian models, a practical concern for practitioners who rely on variational approximations.

RANK_REASON Academic paper on a theoretical aspect of Bayesian inference and model selection.

Read on arXiv stat.ML →

COVERAGE [2]

  1. arXiv stat.ML TIER_1 · Ethan Harvey, Michael C. Hughes

    Occam's Razor is Only as Sharp as Your ELBO

    arXiv:2604.25984v1 · Abstract: The marginal likelihood, also known as the evidence, is regarded as a mathematical embodiment of Occam's razor, enabling model selection that avoids overfitting. The evidence lower bound (ELBO) objective from variational inference h…

  2. arXiv stat.ML TIER_1 · Michael C. Hughes

    Occam's Razor is Only as Sharp as Your ELBO

    The marginal likelihood, also known as the evidence, is regarded as a mathematical embodiment of Occam's razor, enabling model selection that avoids overfitting. The evidence lower bound (ELBO) objective from variational inference has also been used for similar purposes. Prior wo…