Inter-Stance: A Dyadic Multimodal Corpus for Conversational Stance Analysis
Xiang Zhang, Xiaotian Li, Taoyue Wang, Nan Bi, Xin Zhou, and 7 others
TLDR
Inter-Stance introduces a 20TB multimodal dyadic corpus for conversational stance analysis, enabling novel modeling of interpersonal behavior.
Key contributions
- Introduces Inter-Stance, a 20TB multimodal corpus for dyadic conversational stance analysis.
- Features 45 dyads (90 persons) with synchronized 2D/3D face, thermal, voice, and physiology data.
- Includes self-reported affect and annotations for agreement, disagreement, and neutral stance; an illustrative record layout follows this list.
- Compares dyads with and without shared history, enabling novel interpersonal behavior modeling.
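The contribution bullets describe the corpus only at a high level. As a minimal sketch of how one session record might be organized, consider the following; the class and field names are illustrative assumptions, not a published API of the corpus.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

import numpy as np


@dataclass
class Physiology:
    """Per-participant physiological streams (1-D sample arrays)."""
    ppg: np.ndarray
    eda: np.ndarray
    heart_rate: np.ndarray
    blood_pressure: np.ndarray
    respiration: np.ndarray


@dataclass
class ParticipantRecord:
    """One participant's synchronized modalities within a session."""
    face_video_2d: str        # path to the 2D face video
    face_geometry_3d: str     # path to the 3D face geometry sequence
    thermal_video: str        # path to the thermal recording
    audio: str                # path to the voice/speech recording
    physiology: Physiology
    self_reported_affect: Dict[str, float]  # e.g. rating-scale responses


@dataclass
class DyadSession:
    """One dyadic interaction with stance annotations over time."""
    dyad_id: str
    shared_history: bool      # True for acquainted pairs, False for strangers
    participants: Tuple[ParticipantRecord, ParticipantRecord]
    # (start_s, end_s, label) with label in {"agree", "disagree", "neutral"}
    stance_annotations: List[Tuple[float, float, str]]
```

Keeping heavyweight media as file paths while holding physiology and annotations in memory is one plausible layout for a 20TB corpus; the actual release may organize files differently.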
Why it matters
This paper addresses the lack of publicly available multimodal dyadic-interaction datasets. By pairing synchronized behavioral, thermal, and physiological recordings with self-report measures from both partners, the Inter-Stance corpus enables modeling of social interaction that single-person or single-modality datasets cannot support, advancing research on human social dynamics and communication.
Original Abstract
Social interactions dominate our perceptions of the world and shape our daily behavior by attaching social meaning to acts as simple and spontaneous as gestures, facial expressions, voice, and speech. People mimic and otherwise respond to each other's postures, facial expressions, mannerisms, and other verbal and nonverbal behavior, and form appraisals or evaluations in the process. Yet no publicly available dataset includes multimodal recordings and self-report measures of multiple persons in social interaction; dyadic recordings and annotations are lacking. We present a new data corpus of multimodal dyadic interaction (45 dyads, 90 persons) that includes synchronized multimodal behavior, comprising 2D face video, 3D face geometry, thermal spectrum dynamics, voice and speech behavior, physiology (PPG, EDA, heart rate, blood pressure, and respiration), and self-reported affect of all participants in a communicative interaction scenario. Two types of dyads are included: persons with a shared past history and strangers. Annotations cover social signals and agreement, disagreement, and neutral stance. Combined with a potent emotion-induction procedure, these multimodal data enable novel modeling of multimodal interpersonal behavior. We present extensive experiments evaluating the multimodal communication and affect of dyads with and without interpersonal history. This new database enables multimodal modeling of social interaction that was not possible before. The dataset comprises 20TB of multimodal data to be shared with the research community.
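The abstract stresses that modalities are synchronized across participants. As a rough illustration of what per-session alignment can involve, the sketch below resamples physiological streams onto a common video clock; the sampling rates, placeholder signals, and function names are assumptions for illustration, not details reported by the authors.

```python
import numpy as np


def resample_to_common_clock(t_src, x_src, t_common):
    """Linearly interpolate a 1-D signal onto a shared timeline (seconds)."""
    return np.interp(t_common, t_src, x_src)


# Hypothetical example: align 64 Hz EDA and 128 Hz PPG onto a 30 Hz video clock.
fps = 30.0
t_video = np.arange(0, 10, 1 / fps)      # 10 s of video frame timestamps
t_eda = np.arange(0, 10, 1 / 64.0)
t_ppg = np.arange(0, 10, 1 / 128.0)
eda = np.random.randn(t_eda.size)        # placeholder signal values
ppg = np.random.randn(t_ppg.size)

eda_on_video = resample_to_common_clock(t_eda, eda, t_video)
ppg_on_video = resample_to_common_clock(t_ppg, ppg, t_video)
assert eda_on_video.shape == t_video.shape == ppg_on_video.shape
```

Linear interpolation is the simplest alignment choice; the corpus's actual preprocessing may instead rely on hardware triggers or timestamp-based synchronization.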