Publication:
Use of affect based interaction classification for continuous emotion tracking

Publication Date

2017

Language

English

Type

Conference proceeding

Abstract

Natural and affective handshakes of two participants define the course of dyadic interaction. The affective states of the participants are expected to be correlated with the nature of the dyadic interaction. In this paper, we extract two classes of dyadic interaction based on temporal clustering of affective states. We use k-means temporal clustering to define the interaction classes, and utilize a support vector machine (SVM) based classifier to estimate the interaction class types from multimodal speech and motion features. We then investigate the continuous emotion tracking problem over the dyadic interaction classes. We use the JESTKOD database, which consists of speech and full-body motion capture recordings of dyadic interactions with affective annotations in the activation, valence and dominance (AVD) attributes. Continuous affect tracking is performed as estimation of the AVD attributes. Experimental evaluations attain statistically significant (p < 0.05) improvements in affective state estimation using the interaction class information.
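
As an illustration of the pipeline the abstract outlines, below is a minimal sketch in Python, assuming scikit-learn and synthetic stand-in data; none of it is the authors' code. It clusters frame-level AVD annotations with plain k-means (the paper uses a temporal variant), trains an SVM to recover the interaction class from multimodal features, and then fits class-conditioned regressors for the AVD attributes. All names, shapes, and data here are hypothetical.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC, SVR

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins for per-frame data from one dyadic recording:
    # AVD annotations (activation, valence, dominance) and multimodal
    # speech + motion feature vectors; shapes are illustrative only.
    n_frames, n_features = 1000, 24
    avd = rng.normal(size=(n_frames, 3))
    features = rng.normal(size=(n_frames, n_features))

    # Step 1: define two interaction classes by clustering the affective
    # states (plain k-means here; the paper uses temporal clustering).
    interaction_class = KMeans(n_clusters=2, n_init=10,
                               random_state=0).fit_predict(avd)

    # Step 2: estimate the interaction class from the multimodal
    # features with an SVM classifier.
    X_tr, X_te, y_tr, y_te, avd_tr, avd_te = train_test_split(
        features, interaction_class, avd, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("interaction-class accuracy:", clf.score(X_te, y_te))

    # Step 3: continuous affect tracking as regression of each AVD
    # attribute, with a separate regressor per predicted interaction
    # class so the class information can inform the estimation.
    for c in (0, 1):
        tr_mask = y_tr == c
        te_mask = clf.predict(X_te) == c
        for a, name in enumerate(("activation", "valence", "dominance")):
            if tr_mask.any() and te_mask.any():
                reg = SVR().fit(X_tr[tr_mask], avd_tr[tr_mask, a])
                r2 = reg.score(X_te[te_mask], avd_te[te_mask, a])
                print(f"class {c} {name} R^2: {r2:.3f}")

On random placeholder data the scores are of course meaningless; the sketch only shows how a predicted interaction class could condition the continuous affect estimators, not the paper's actual features or results.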

Source

ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Subject

Acoustics, Engineering, Electrical and electronic engineering
