Publication:
Affective synthesis and animation of arm gestures from speech prosody

Abstract

In human-to-human communication, speech signals carry rich emotional cues that are further emphasized by affect-expressive gestures. Automatic synthesis and animation of gestures accompanying affective verbal communication can therefore help create more naturalistic virtual agents in human-computer interaction systems. Speech-driven gesture synthesis can map the emotional cues of the speech signal to affect-expressive gestures by modeling the complex variability and timing relationships between speech and gesture. In this paper, we investigate the use of continuous affect attributes, namely activation, valence, and dominance, for speech-driven affective synthesis and animation of arm gestures. To this end, we present a statistical framework based on hidden semi-Markov models (HSMMs), in which states are gestures and observations are speech prosody and continuous affect attributes. The proposed framework is evaluated over four distinct HSMM structures that differ in their emission distributions. Evaluations are performed on the USC CreativeIT database in a speaker-independent setup. Among the four structures, the conditional structure, which models the observation distribution as prosody given affect, achieves the best performance in both objective and subjective evaluations.
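
The conditional emission structure singled out in the abstract can be made concrete with a small sketch. The following Python snippet is a minimal illustration under assumed parameters, not the paper's implementation: it factors each frame's likelihood as p(prosody | affect, gesture state) * p(affect | gesture state), using a linear-Gaussian model for the conditional term. The state count, feature dimensionalities, and randomly initialized parameters are all hypothetical.

import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

N_STATES = 3   # hypothetical number of gesture classes
D_PROSODY = 4  # assumed prosody features (e.g., pitch, intensity, deltas)
D_AFFECT = 3   # activation, valence, dominance

# Per-state parameters, randomly initialized for illustration only;
# in practice they would be estimated (e.g., via EM) from annotated data.
params = [{
    "A": rng.normal(size=(D_PROSODY, D_AFFECT)),  # regression of prosody on affect
    "b": rng.normal(size=D_PROSODY),              # prosody offset
    "cov_p": np.eye(D_PROSODY),                   # prosody residual covariance
    "mu_a": rng.normal(size=D_AFFECT),            # affect mean for this gesture
    "cov_a": np.eye(D_AFFECT),                    # affect covariance
} for _ in range(N_STATES)]

def emission_loglik(prosody, affect, state):
    """log p(prosody, affect | state) under the conditional factorization."""
    p = params[state]
    mean_p = p["A"] @ affect + p["b"]  # prosody mean conditioned on affect
    return (multivariate_normal.logpdf(prosody, mean_p, p["cov_p"])
            + multivariate_normal.logpdf(affect, p["mu_a"], p["cov_a"]))

# Score one synthetic observation frame against each gesture state.
prosody_t = rng.normal(size=D_PROSODY)
affect_t = rng.normal(size=D_AFFECT)
scores = [emission_loglik(prosody_t, affect_t, s) for s in range(N_STATES)]
print("most likely gesture state:", int(np.argmax(scores)))

In the full framework, such parameters would be estimated from gesture-annotated speech, and decoding would run over the whole HSMM with explicit state-duration models rather than frame-by-frame scoring as above.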

Publisher

Elsevier

Subject

Acoustics, Computer science

Source

Speech Communication

DOI

10.1016/j.specom.2020.02.005
