PulseAugur
research

Researchers develop Evolutionary Dynamic Loss for distribution-free pretraining

Researchers have developed a new framework called Evolutionary Dynamic Loss (EDL) for the distribution-free pretraining of classification losses. EDL learns a transferable loss function from unlimited synthetic prediction-label pairs, avoiding the need for real samples during the main pretraining phase. The framework parameterizes the loss as a lightweight network and optimizes it with an evolutionary strategy, incorporating chaotic mutation to enhance exploration and improve convergence. Experiments on CIFAR-10 showed that EDL can effectively replace cross-entropy, achieving comparable or better accuracy.
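To make the idea concrete, here is a minimal sketch of the scheme the summary describes. Everything in it is an assumption for illustration, not the authors' implementation: the loss is a tiny MLP over (prediction, one-hot label) pairs in probability space, the "synthetic data" is a stand-in objective that rewards scoring confident-wrong predictions above confident-correct ones, and the chaotic mutation is a logistic map scaling the mutation noise of a simple elite-averaging evolutionary strategy.

```python
import numpy as np

rng = np.random.default_rng(0)
K, H = 3, 8                    # classes, hidden width of the loss network
DIM = 2 * K * H + H            # total weight count of the loss network

def loss_net(w, p, y):
    """Per-sample loss from a tiny MLP over (prediction, one-hot label)."""
    W1 = w[:2 * K * H].reshape(2 * K, H)
    W2 = w[2 * K * H:]
    h = np.tanh(np.concatenate([p, y], axis=1) @ W1)
    return np.log1p(np.exp(h @ W2))   # softplus keeps the loss non-negative

def fitness(w, n=256):
    """Stand-in synthetic objective (assumed, not from the paper): a useful
    loss should score confident-correct pairs lower than confident-wrong ones,
    so we maximize the separation between the two."""
    labels = rng.integers(0, K, size=n)
    y = np.eye(K)[labels]
    good = 0.9 * y + 0.1 / K                            # mass on true class
    bad = 0.9 * np.eye(K)[(labels + 1) % K] + 0.1 / K   # mass on wrong class
    return loss_net(w, bad, y).mean() - loss_net(w, good, y).mean()

def evolve(gens=60, pop=40, elite=8, sigma=0.3):
    """(mu, lambda)-style evolutionary strategy with chaotic mutation scale."""
    mean = rng.normal(scale=0.1, size=DIM)
    x = 0.7                                  # logistic-map state in (0, 1)
    for _ in range(gens):
        x = 4.0 * x * (1.0 - x)              # chaotic map drives exploration
        noise = rng.normal(size=(pop, DIM)) * sigma * (0.5 + x)
        cand = mean + noise
        scores = np.array([fitness(w) for w in cand])
        mean = cand[np.argsort(scores)[-elite:]].mean(axis=0)  # elite average
    return mean, fitness(mean)

w, f = evolve()
print(f"separation (wrong-pair loss minus correct-pair loss): {f:.3f}")
```

In the actual framework, the evolved loss would then be plugged into standard gradient-based training of a real classifier (e.g. on CIFAR-10) in place of cross-entropy; the sketch only covers the loss-pretraining stage.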

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Introduces a novel method for training classification losses that could improve model performance and generalization.

RANK_REASON This is a research paper detailing a new framework for classification losses.

Read on arXiv cs.LG →

COVERAGE [2]

  1. arXiv cs.LG TIER_1 · Meng Xiang, Yan Pei

    Distribution-Free Pretraining of Classification Losses via Evolutionary Dynamics

    arXiv:2605.03722v1 Announce Type: new Abstract: We propose Evolutionary Dynamic Loss (EDL), a framework that learns a transferable classification loss in the probability space using unlimited synthetic prediction-label pairs, without accessing real samples during the main loss pr…

  2. arXiv cs.LG TIER_1 · Yan Pei

    Distribution-Free Pretraining of Classification Losses via Evolutionary Dynamics

    We propose Evolutionary Dynamic Loss (EDL), a framework that learns a transferable classification loss in the probability space using unlimited synthetic prediction-label pairs, without accessing real samples during the main loss pretraining stage. EDL parameterizes the loss as a…