PulseAugur

Researchers release Inter-Stance, a 20TB multimodal corpus for conversational stance analysis

Researchers have introduced the Inter-Stance corpus, a new multimodal dataset designed for analyzing conversational stance in social interactions. The corpus comprises recordings from 45 dyads, capturing synchronized data across multiple modalities, including video, thermal imaging, voice, and physiological signals. Annotations within the dataset cover social signals, agreement, disagreement, and neutral stance, enabling advanced modeling of interpersonal behavior.
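The corpus structure described above (dyads, synchronized modality streams, and stance annotations) can be sketched with a small data model. This is a hypothetical illustration only: the class names, stream keys, file paths, and annotation schema below are assumptions for the sketch, not the corpus's actual format.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List

# Stance labels as described in the summary: agreement, disagreement, neutral.
class Stance(Enum):
    AGREEMENT = "agreement"
    DISAGREEMENT = "disagreement"
    NEUTRAL = "neutral"

# Hypothetical record for one annotated segment of a dyadic conversation.
@dataclass
class StanceSegment:
    start_s: float   # segment start time, in seconds
    end_s: float     # segment end time, in seconds
    speaker: int     # 0 or 1: which participant in the dyad
    stance: Stance

# Hypothetical container for one dyad's synchronized recordings.
@dataclass
class DyadSession:
    dyad_id: int
    # Modality name -> path to its synchronized stream
    # (video, thermal, audio, physiological signals).
    streams: Dict[str, str] = field(default_factory=dict)
    segments: List[StanceSegment] = field(default_factory=list)

    def stance_counts(self) -> Dict[Stance, int]:
        """Count annotated segments per stance label."""
        counts = {s: 0 for s in Stance}
        for seg in self.segments:
            counts[seg.stance] += 1
        return counts

# Example: one of the 45 dyads, with illustrative (made-up) paths and segments.
session = DyadSession(
    dyad_id=1,
    streams={
        "video": "dyad01/video.mp4",
        "thermal": "dyad01/thermal.seq",
        "audio": "dyad01/audio.wav",
        "physio": "dyad01/physio.csv",
    },
    segments=[
        StanceSegment(0.0, 4.2, speaker=0, stance=Stance.AGREEMENT),
        StanceSegment(4.2, 9.8, speaker=1, stance=Stance.DISAGREEMENT),
        StanceSegment(9.8, 12.0, speaker=0, stance=Stance.NEUTRAL),
    ],
)
print(session.stance_counts()[Stance.AGREEMENT])  # 1
```

A per-label segment count like this is a typical first sanity check when loading a stance-annotated corpus, before any multimodal modeling.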

Summary written by gemini-2.5-flash-lite from 2 sources.

IMPACT Enables new research into multimodal interpersonal behavior modeling.

RANK_REASON The cluster describes a new academic paper introducing a novel dataset for research purposes.

Read on arXiv cs.CV →

COVERAGE [2]

  1. arXiv cs.CV TIER_1 · Xiang Zhang, Xiaotian Li, Taoyue Wang, Nan Bi, Xin Zhou, Cody Zhou, Zoie Wang, Andrew Yang, Yuming Su, Jeff Cohn, Qiang Ji, Lijun Yin

    Inter-Stance: A Dyadic Multimodal Corpus for Conversational Stance Analysis

    arXiv:2604.22739v1 Announce Type: new Abstract: Social interactions dominate our perceptions of the world and shape our daily behavior by attaching social meaning to acts as simple and spontaneous as gestures, facial expressions, voice, and speech. People mimic and otherwise resp…

  2. arXiv cs.CV TIER_1 · Lijun Yin

    Inter-Stance: A Dyadic Multimodal Corpus for Conversational Stance Analysis

    Social interactions dominate our perceptions of the world and shape our daily behavior by attaching social meaning to acts as simple and spontaneous as gestures, facial expressions, voice, and speech. People mimic and otherwise respond to each other's postures, facial expressions…