Publication:
Affect burst recognition using multi-modal cues

Alternative Title

Çok-kipli ipuçları kullanarak duygusal patlama tanıma (Turkish: "Affect burst recognition using multi-modal cues")

Abstract

Affect bursts, which are nonverbal expressions of emotions in conversations, play a critical role in analyzing affective states. Although a number of methods exist for affect burst detection and recognition using only audio information, little effort has been spent on combining cues in a multi-modal setup. We suggest that facial gestures constitute a key component for characterizing affect bursts, and hence have potential for more robust affect burst detection and recognition. We take a data-driven approach to characterize affect bursts using Hidden Markov Models (HMMs), and employ a multi-modal decision fusion scheme that combines cues from audio and facial gestures for classification of affect bursts. We demonstrate the contribution of facial gestures to affect burst recognition by conducting experiments on an audiovisual database which comprises speech and facial motion data from various dyadic conversations.
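The abstract describes decision-level (late) fusion of per-modality HMM classifiers. A minimal sketch of that idea, assuming each modality's HMM set produces per-class log-likelihoods and that fusion is a weighted sum (the class names, scores, and weights below are hypothetical, not taken from the paper):

```python
# Late-fusion sketch: each modality's HMMs score every affect-burst class
# with a log-likelihood; a weighted sum of the two modality scores picks
# the winning class. Weights w_audio/w_visual are hypothetical tuning knobs.

def fuse_and_classify(audio_loglik, visual_loglik, w_audio=0.6, w_visual=0.4):
    """Combine per-class log-likelihoods from audio and visual modalities."""
    fused = {c: w_audio * audio_loglik[c] + w_visual * visual_loglik[c]
             for c in audio_loglik}
    return max(fused, key=fused.get)  # class with the highest fused score

# Hypothetical per-class scores for three illustrative affect-burst classes
audio = {"laughter": -10.2, "cry": -15.8, "breathing": -12.1}
visual = {"laughter": -9.5, "cry": -11.0, "breathing": -14.3}
print(fuse_and_classify(audio, visual))  # → laughter
```

Late fusion keeps the audio and facial-gesture models independent, so either classifier can be retrained or reweighted without touching the other.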

Publisher

IEEE Computer Society

Subject

Civil engineering, Electrical and electronics engineering, Telecommunication

Source

2014 22nd Signal Processing and Communications Applications Conference, SIU 2014 - Proceedings

DOI

10.1109/SIU.2014.6830552
