PulseAugur

EduGage dataset enables sensor-based engagement assessment in video learning

Researchers have developed EduGage, a system that uses wearable and camera-based sensors to assess learner engagement during self-guided video learning. The system collects physiological and motion signals, such as heart rate and gaze data, to estimate momentary engagement levels. In a study with 16 participants, EduGage achieved promising accuracy in engagement estimation, outperforming several baseline models. The team is releasing the EduGage dataset to facilitate further research in this area.
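The pipeline described above, turning windows of physiological and gaze samples into an engagement estimate, can be sketched in a few lines. This is an illustrative toy, not the paper's method: the feature names (`hr_mean`, `gaze_dispersion`), the rest baseline of 60 bpm, and the rule-based scoring are all assumptions for demonstration; the actual system learns its model from labeled data.

```python
from statistics import mean, pstdev

def extract_features(window):
    """window: list of (heart_rate_bpm, gaze_x, gaze_y) sensor samples."""
    hr = [s[0] for s in window]
    gx = [s[1] for s in window]
    gy = [s[2] for s in window]
    return {
        "hr_mean": mean(hr),                         # arousal proxy
        "gaze_dispersion": pstdev(gx) + pstdev(gy),  # attention proxy
    }

def estimate_engagement(features):
    """Rule-based score in [0, 1]; a real system would learn this mapping."""
    # Steadier fixation (lower gaze dispersion) -> higher attention.
    attention = max(0.0, 1.0 - features["gaze_dispersion"] / 0.5)
    # Heart rate above a hypothetical 60 bpm rest baseline -> higher arousal.
    arousal = min(1.0, max(0.0, (features["hr_mean"] - 60.0) / 40.0))
    return 0.5 * attention + 0.5 * arousal

# A short window of fabricated samples from an attentive viewer.
window = [(78, 0.50, 0.50), (80, 0.51, 0.49), (79, 0.50, 0.51)]
print(estimate_engagement(extract_features(window)))
```

A learned model would replace `estimate_engagement` with a classifier or regressor trained on sensor features against ground-truth engagement labels, which is where baseline comparisons like those in the study come in.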

Summary written by gemini-2.5-flash-lite from 1 source.

IMPACT This research could lead to more adaptive and personalized online learning experiences by enabling systems to better understand learner engagement.

RANK_REASON This is a research paper detailing a new method and dataset for assessing learner engagement using sensor data.

Read on arXiv cs.CV →

COVERAGE [1]

  1. arXiv cs.CV TIER_1 · Zikang Leng, Edan Eyal, Yingtian Shi, Jiaman He, Yaqi Liu, Thomas Plötz

    EduGage: Methods and Dataset for Sensor-Based Momentary Assessment of Engagement in Self-Guided Video Learning

    arXiv:2605.01238v1 Announce Type: cross Abstract: Engagement, which links to attentional, emotional, and cognitive dimensions, plays an important role in learning. In online and video-based learning environments, learners often need to regulate their own interactions with instruc…