Publication: Use of agreement/disagreement classification in dyadic interactions for continuous emotion recognition
KU Authors
Co-Authors
N/A
Publication Date
2016
Language
English
Type
Conference proceeding
Abstract
Natural and affective handshakes of two participants define the course of dyadic interaction. The affective states of the participants are expected to be correlated with the nature or type of the dyadic interaction. In this study, we investigate the relationship between affective attributes and the nature of the dyadic interaction. For this investigation we use the JESTKOD database, which consists of speech and full-body motion capture recordings of dyadic interactions under agreement and disagreement scenarios. The dataset also has affective annotations for the activation, valence and dominance (AVD) attributes. We pose the continuous affect recognition problem under the agreement and disagreement scenarios of dyadic interactions. We define a statistical mapping using support vector regression (SVR) from the speech and motion modalities to the affective attributes, with and without dyadic interaction type (DIT) information. We observe an improvement in the estimation of the valence attribute when the DIT is available. Furthermore, this improvement is sustained even when we estimate the DIT from the speech and motion modalities of the dyadic interaction.
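To make the mapping described in the abstract concrete, below is a minimal sketch (not the authors' code) of support vector regression from speech/motion features to a continuous valence score, with and without a dyadic interaction type (DIT) indicator appended as an input feature. It uses scikit-learn's SVR; the feature dimensions and synthetic data are illustrative assumptions, not values from the JESTKOD database.

# Minimal sketch: SVR-based valence estimation with and without DIT information.
# All feature sizes and the synthetic data are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

n_frames, n_feats = 500, 20                    # assumed frame count / feature size
X = rng.normal(size=(n_frames, n_feats))       # stand-in for speech + motion features
dit = rng.integers(0, 2, size=n_frames)        # 1 = agreement, 0 = disagreement
valence = X[:, 0] + 0.5 * dit + 0.1 * rng.normal(size=n_frames)  # synthetic target

split = n_frames // 2
train, test = slice(None, split), slice(split, None)

# Baseline: valence regression without the DIT information.
svr_plain = SVR(kernel="rbf", C=1.0).fit(X[train], valence[train])
r2_plain = r2_score(valence[test], svr_plain.predict(X[test]))

# With DIT: append the (known or estimated) interaction type as an extra feature.
X_dit = np.hstack([X, dit[:, None]])
svr_dit = SVR(kernel="rbf", C=1.0).fit(X_dit[train], valence[train])
r2_dit = r2_score(valence[test], svr_dit.predict(X_dit[test]))

print(f"R^2 without DIT: {r2_plain:.3f}")
print(f"R^2 with DIT:    {r2_dit:.3f}")

In the paper's setting, the DIT flag would come either from the ground-truth scenario label or from a classifier driven by the same speech and motion modalities; the sketch simply treats it as a known binary input.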
Description
Source:
17th Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), Vols 1-5: Understanding Speech Processing in Humans and Machines
Publisher:
International Speech Communication Association (ISCA)
Subject
Acoustics, Computer science, Artificial intelligence, Engineering, Electrical and electronic engineering, Linguistics