
New theory resolves instability in MeanFlow generative models

Researchers have developed a theoretical framework to address instability in MeanFlow training, a one-step generative modeling technique. They identify that the conditional velocity field is misused in the loss function: it plays two distinct statistical roles but is weighted by a single, incorrect coefficient. The study derives the optimal coefficient and shows that several concurrent fixes are, in effect, practical implementations of this optimum. Applying the optimal coefficient significantly improves sample quality and yields a monotonically improving FID trend when training latent Diffusion Transformers.
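The core idea, choosing a coefficient that minimizes the variance of a stochastic objective, mirrors the classic optimal control-variate coefficient. The paper's actual MeanFlow objective is more involved; the sketch below only illustrates the generic principle with a toy Monte Carlo estimator, where all function choices are illustrative assumptions rather than the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a stochastic loss: we estimate E[f(eps)], eps ~ N(0, 1),
# using a correlated control variate g(eps) with known mean E[g] = 0.
# (Illustrative only; not the paper's MeanFlow loss.)
def f(eps):
    return np.exp(0.5 * eps)   # quantity whose mean we estimate

def g(eps):
    return eps                 # control variate, E[g(eps)] = 0

eps = rng.standard_normal(100_000)
x, c = f(eps), g(eps)

# Optimal coefficient minimizing Var[x - beta * c]:
#   beta* = Cov(x, c) / Var(c)
beta = np.cov(x, c)[0, 1] / np.var(c)

naive = x                      # plain estimator samples
adjusted = x - beta * c        # same expectation, lower variance

print(np.var(adjusted) < np.var(naive))   # variance is reduced
```

With any other fixed coefficient in place of `beta`, the variance of `adjusted` can only be equal or worse; this is the sense in which such a coefficient is "optimal", analogous to the role the paper assigns to its derived coefficient.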

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT Introduces a theoretical fix for instability in MeanFlow training, potentially improving sample quality and model performance in generative AI.

RANK_REASON The cluster contains an academic paper detailing theoretical advancements and experimental validation in a specific machine learning technique.

Read on arXiv stat.ML →

COVERAGE [1]

  1. arXiv stat.ML TIER_1 · Ziran Wang

    On Variance Reduction in Learning Mean Flows

    One-step generative modeling has emerged as a leading approach to amortize the inference cost of diffusion and flow-matching models. Among distillation-free methods, MeanFlow training is notoriously unstable, with non-decreasing loss and unbounded gradient variance. In this work,…