
"This Wasn't Made for Me": ASR Bias Hurts Users Emotionally and Cognitively

A new research paper highlights the emotional and psychological toll of bias in Automatic Speech Recognition (ASR) systems. The study, which involved user experience research in four U.S. locations, found that participants often felt the technology failed to accommodate their cultural backgrounds. This led to feelings of inadequacy and frustration, as users performed significant invisible labor, such as code-switching and hyper-articulation, to make the systems work. The paper argues that traditional accuracy metrics for algorithmic fairness overlook these critical dimensions of harm.

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Highlights the need for ASR evaluation beyond accuracy metrics to include user experience and emotional impact.

RANK_REASON Academic paper on bias in ASR systems.

Read on arXiv cs.CL →

"This Wasn't Made for Me": ASR Bias Hurts Users Emotionally and Cognitively

COVERAGE [2]

  1. arXiv cs.CL TIER_1 · Siyu Liang, Alicia Beckford Wassink

    "This Wasn't Made for Me": Recentering User Experience and Emotional Impact in the Evaluation of ASR Bias

    arXiv:2604.21148v2 (announce type: replace). Abstract: Studies on bias in Automatic Speech Recognition (ASR) tend to focus on reporting error rates for speakers of underrepresented dialects, yet less research examines the human side of system bias: how do system failures shape users…

  2. arXiv cs.CL TIER_1 · Alicia Beckford Wassink

    "This Wasn't Made for Me": Recentering User Experience and Emotional Impact in the Evaluation of ASR Bias

    Studies on bias in Automatic Speech Recognition (ASR) tend to focus on reporting error rates for speakers of underrepresented dialects, yet less research examines the human side of system bias: how do system failures shape users' lived experiences, how do users feel about and rea…