Publication:
Multimodal analysis of speech prosody and upper body gestures using hidden semi-Markov models

Abstract

Gesticulation is an essential component of face-to-face communication, and it contributes significantly to the natural and affective perception of human-to-human communication. In this work we investigate a new multimodal analysis framework to model relationships between intonational and gesture phrases using hidden semi-Markov models (HSMMs). The HSMM framework effectively associates longer-duration gesture phrases with shorter-duration prosody clusters while maintaining realistic gesture-phrase duration statistics. We evaluate the multimodal analysis framework by generating speech-prosody-driven gesture animation and assessing it with both subjective and objective metrics.
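
The record does not include the authors' implementation. As a rough illustration of the kind of explicit-duration (semi-Markov) decoding the abstract describes, the sketch below decodes a sequence of prosody-cluster labels into gesture-phrase segments, where each hidden state carries its own duration distribution so that one longer gesture phrase can span several shorter prosody units. All names, distributions, and sizes here are hypothetical placeholders, not the paper's model.

import numpy as np

def hsmm_viterbi(obs, pi, A, B, D):
    """Viterbi decoding for an explicit-duration HSMM (illustrative sketch).

    obs : (T,) int array of prosody-cluster labels (hypothetical)
    pi  : (S,) initial state probabilities
    A   : (S, S) state-transition probabilities (no self-transitions)
    B   : (S, K) per-state emission probabilities over prosody labels
    D   : (S, Dmax) per-state duration probabilities for durations 1..Dmax
    Returns the most likely gesture-phrase state per frame.
    """
    T = len(obs)
    S, Dmax = D.shape
    log = lambda x: np.log(np.clip(x, 1e-300, None))
    logB = log(B)
    # delta[t, s]: best log-prob of a segmentation of obs[0..t] whose last
    # segment is in state s and ends at frame t
    delta = np.full((T, S), -np.inf)
    back = np.zeros((T, S, 2), dtype=int)  # (previous state, segment duration)

    for t in range(T):
        for s in range(S):
            for d in range(1, min(Dmax, t + 1) + 1):
                start = t - d + 1
                emit = logB[s, obs[start:t + 1]].sum()
                dur = log(D[s, d - 1])
                if start == 0:
                    score = log(pi[s]) + dur + emit
                    prev = -1
                else:
                    prev_scores = delta[start - 1] + log(A[:, s])
                    prev = int(np.argmax(prev_scores))
                    score = prev_scores[prev] + dur + emit
                if score > delta[t, s]:
                    delta[t, s] = score
                    back[t, s] = (prev, d)

    # Backtrack one segment (gesture phrase) at a time
    path = np.empty(T, dtype=int)
    t = T - 1
    s = int(np.argmax(delta[t]))
    while t >= 0:
        prev, d = back[t, s]
        path[t - d + 1:t + 1] = s
        t -= d
        s = prev
    return path

if __name__ == "__main__":
    # Toy example with random (made-up) model parameters
    rng = np.random.default_rng(0)
    S, K, Dmax, T = 3, 4, 6, 20
    pi = np.full(S, 1.0 / S)
    A = np.full((S, S), 1.0 / (S - 1))
    np.fill_diagonal(A, 0.0)
    B = rng.dirichlet(np.ones(K), size=S)
    D = rng.dirichlet(np.ones(Dmax), size=S)
    obs = rng.integers(0, K, size=T)
    print(hsmm_viterbi(obs, pi, A, B, D))

The key difference from a standard HMM is the explicit per-state duration distribution D, which is what lets the decoder keep realistic gesture-phrase lengths instead of the geometric durations implied by self-transitions.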

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Subject

Acoustics, Electrical and electronics engineering

Source

ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings

DOI

10.1109/ICASSP.2013.6638339
