Publication: Affect burst detection using multi-modal cues
Alternative Title
Çok-kipli modelleme ile duygusal patlama sezimi (Affect burst detection using multi-modal modelling)
Abstract
Affect bursts have recently gained importance in the field of emotion recognition, since they can serve as a prior for recognising the underlying affective state. In this paper we propose a data-driven approach for detecting affect bursts from multimodal input streams such as audio and facial landmark points. The proposed Gaussian Mixture Model (GMM) based method learns each modality independently and then combines the probabilistic outputs to form a decision. This gives it an edge over feature-fusion based methods, as it can handle cases in which one of the modalities is too noisy or unavailable. We demonstrate the robustness of the proposed approach on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) database, which contains realistic and natural dyadic conversations. The database was annotated by three annotators, who segmented and labelled affect bursts for training and testing. We also present a performance comparison between SVM-based and GMM-based methods under the same experimental configuration.
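The abstract describes per-modality GMMs whose probabilistic outputs are combined at the decision level, so a missing or noisy modality can simply be skipped. The following is a minimal sketch of that idea, not the authors' implementation: the class labels, modality names, feature layout, score weighting, and use of scikit-learn are all assumptions made for illustration.

```python
# Hypothetical sketch of decision-level fusion with per-modality GMMs.
# Feature extraction (e.g. audio features, facial landmark points) is assumed
# to have been done elsewhere; each segment is a (frames, dims) matrix.
import numpy as np
from sklearn.mixture import GaussianMixture

CLASSES = ["affect_burst", "non_burst"]    # hypothetical label set
MODALITIES = ["audio", "landmarks"]        # hypothetical modality names

def train_gmms(train_data, n_components=8):
    """train_data[modality][label] -> (N, D) feature matrix.
    Fits one class-conditional GMM per (modality, label) pair."""
    models = {}
    for mod in MODALITIES:
        for label in CLASSES:
            gmm = GaussianMixture(n_components=n_components,
                                  covariance_type="diag", random_state=0)
            gmm.fit(train_data[mod][label])
            models[(mod, label)] = gmm
    return models

def classify(models, sample, weights=None):
    """sample[modality] -> (T, D) frames for one segment; a modality may be
    absent (None), in which case it is simply ignored in the fused score."""
    weights = weights or {mod: 1.0 for mod in MODALITIES}
    scores = {label: 0.0 for label in CLASSES}
    for mod in MODALITIES:
        frames = sample.get(mod)
        if frames is None:   # noisy or unavailable modality: skip it
            continue
        for label in CLASSES:
            # average per-frame log-likelihood under the class GMM,
            # weighted per modality and summed across modalities
            loglik = models[(mod, label)].score_samples(frames).mean()
            scores[label] += weights[mod] * loglik
    return max(scores, key=scores.get)
```

Because the fusion happens on per-modality likelihoods rather than concatenated features, dropping a modality only removes its term from the score instead of invalidating the whole feature vector, which is the advantage over feature fusion noted in the abstract.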
Publisher
IEEE
Subject
Computer engineering
Source
2015 23rd Signal Processing and Communications Applications Conference, SIU 2015 - Proceedings
DOI
10.1109/SIU.2015.7130002