PulseAugur

HuM-Eval framework improves video generation quality assessment

Researchers have developed HuM-Eval, a new framework for evaluating the quality of human motion in generated videos. The system uses a coarse-to-fine strategy: a Vision Language Model first performs a broad quality assessment, followed by a detailed analysis of pose accuracy and motion stability. HuM-Eval reportedly achieves a 58.2% correlation with human judgment, surpassing existing methods. The team also introduced HuM-Bench, a benchmark dataset of 1,000 prompts, to support the evaluation of text-to-video models.
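The coarse-to-fine idea described above can be sketched in a few lines: a cheap first pass filters out obviously flawed videos, and only candidates that clear a threshold receive the detailed pose/motion analysis. All function names, weights, and thresholds below are illustrative assumptions, not the paper's actual implementation.

```python
def coarse_vlm_score(video_description: str) -> float:
    """Stage 1: stand-in for a Vision Language Model's broad quality rating.

    A real system would query a VLM on sampled video frames; here we fake a
    score in [0, 1] from keyword heuristics purely for illustration.
    """
    flaws = ("distorted", "extra limb", "jitter")
    penalty = sum(0.3 for flaw in flaws if flaw in video_description)
    return max(0.0, 1.0 - penalty)


def fine_pose_and_motion_score(pose_errors: list,
                               frame_displacements: list) -> float:
    """Stage 2: detailed check of pose accuracy and motion stability.

    pose_errors: per-frame joint-position errors (lower is better).
    frame_displacements: frame-to-frame motion magnitudes; high variance
    suggests unstable, flickering motion.
    """
    avg_pose_error = sum(pose_errors) / len(pose_errors)
    mean_disp = sum(frame_displacements) / len(frame_displacements)
    disp_variance = sum((d - mean_disp) ** 2
                        for d in frame_displacements) / len(frame_displacements)
    pose_term = max(0.0, 1.0 - avg_pose_error)
    stability_term = 1.0 / (1.0 + disp_variance)
    return 0.5 * pose_term + 0.5 * stability_term


def coarse_to_fine_score(description: str, pose_errors: list,
                         frame_displacements: list,
                         coarse_threshold: float = 0.5) -> float:
    """Run the cheap coarse pass first; only videos that clear the
    threshold get the expensive fine-grained analysis (assumed weighting)."""
    coarse = coarse_vlm_score(description)
    if coarse < coarse_threshold:
        return coarse  # rejected at the coarse stage, skip stage 2
    fine = fine_pose_and_motion_score(pose_errors, frame_displacements)
    return 0.5 * coarse + 0.5 * fine
```

For example, a video described as "a person walking smoothly" with small, steady per-frame errors scores well, while one flagged as "distorted figure with extra limb and jitter" is rejected by the coarse stage before any pose analysis runs.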

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Improves evaluation metrics for human motion in generated videos, potentially guiding future text-to-video model development.

RANK_REASON The cluster describes a new academic paper detailing a novel evaluation framework for video generation models.

Read on arXiv cs.CV →

COVERAGE [2]

  1. arXiv cs.CV TIER_1 · Bingzi Zhang, Kaisi Guan, Ruihua Song

    HuM-Eval: A Coarse-to-Fine Framework for Human-Centric Video Evaluation

    arXiv:2604.25361v1 Announce Type: new Abstract: Video generation models have developed rapidly in recent years, where generating natural human motion plays a pivotal role. However, accurately evaluating the quality of generated human motion video remains a significant challenge. …

  2. arXiv cs.CV TIER_1 · Ruihua Song

    HuM-Eval: A Coarse-to-Fine Framework for Human-Centric Video Evaluation

    Video generation models have developed rapidly in recent years, where generating natural human motion plays a pivotal role. However, accurately evaluating the quality of generated human motion video remains a significant challenge. Existing evaluation metrics primarily focus on g…