Publication:
Affect burst detection using multi-modal cues

Publication Date

2015

Language

Turkish

Type

Conference proceeding

Abstract

Affect bursts have recently gained significant importance in the field of emotion recognition, since they can serve as a prior for recognising the underlying emotions. In this paper we propose a data-driven approach for detecting affect bursts from multimodal input streams, namely audio and facial landmark points. The proposed Gaussian mixture model (GMM) based method learns each modality independently and then combines the probabilistic outputs to form a decision. This gives it an edge over feature-fusion-based methods, as it can handle cases in which one of the modalities is too noisy or unavailable. We demonstrate the robustness of the proposed approach on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) database, which contains realistic and natural dyadic conversations. The database was segmented and labelled for affect bursts by three annotators for training and testing purposes. We also present a performance comparison between SVM-based and GMM-based methods under the same experimental configuration.
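The decision-level fusion described in the abstract can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it assumes a single 1-D Gaussian per class per modality (the paper uses full GMMs over audio and facial-landmark features), and all feature values, class names, and modality names below are hypothetical. The key point it demonstrates is that per-modality log-likelihoods are summed at decision time, so a missing or unusable modality can simply be skipped.

```python
import math

def fit_gaussian(samples):
    """Fit a 1-D Gaussian (mean, variance) to a list of scalar features."""
    mu = sum(samples) / len(samples)
    var = sum((x - mu) ** 2 for x in samples) / len(samples) or 1e-6
    return mu, var

def log_likelihood(x, params):
    """Log-density of x under a 1-D Gaussian with the given (mean, variance)."""
    mu, var = params
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def fuse_decision(x_by_modality, models):
    """Sum per-modality log-likelihoods for each class and pick the best.

    Modalities given as None are skipped, which is the advantage of
    decision-level fusion over feature fusion when a stream is noisy
    or unavailable.
    """
    scores = {}
    for label, modality_models in models.items():
        score = 0.0
        for name, params in modality_models.items():
            x = x_by_modality.get(name)
            if x is not None:  # modality absent or too noisy: skip it
                score += log_likelihood(x, params)
        scores[label] = score
    return max(scores, key=scores.get)

# Toy training data (hypothetical): audio energy and landmark
# displacement per class.
train = {
    "burst":    {"audio": [2.0, 2.2, 1.9],  "face": [1.5, 1.7, 1.6]},
    "no_burst": {"audio": [0.1, 0.2, 0.15], "face": [0.2, 0.1, 0.25]},
}
models = {label: {m: fit_gaussian(v) for m, v in mods.items()}
          for label, mods in train.items()}

print(fuse_decision({"audio": 2.1, "face": 1.6}, models))   # both modalities
print(fuse_decision({"audio": None, "face": 0.2}, models))  # audio missing
```

A real system would replace `fit_gaussian` with a multi-component GMM per modality (e.g. scikit-learn's `GaussianMixture`) trained on frame-level audio and facial-landmark features, but the fusion rule (summing per-modality log-likelihoods and dropping unavailable streams) stays the same.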

Source:

2015 23rd Signal Processing and Communications Applications Conference, SIU 2015 - Proceedings

Publisher:

IEEE

Subject

Computer engineering

