Publication: Combined gesture-speech analysis and speech driven gesture synthesis
Co-Authors
Sargin, M. E.
Aran, O.
Karpov, A.
Yasinnik, Y.
Wilson, S.
Publication Date
2006
Language
English
Type
Conference proceeding
Abstract
Multimodal speech and speaker modeling and recognition are widely accepted as vital aspects of state-of-the-art human-machine interaction systems. While correlations between speech and lip motion, as well as speech and facial expressions, are widely studied, relatively little work has been done to investigate the correlations between speech and gesture. Detection and modeling of head, hand, and arm gestures of a speaker have been studied extensively, and these gestures were shown to carry linguistic information. A typical example is the head gesture while saying "yes/no". In this study, the correlation between gestures and speech is investigated. In speech signal analysis, keyword spotting and prosodic accent event detection have been performed. In gesture analysis, hand positions and parameters of global head motion are used as features. The detection of gestures is based on discrete pre-designated symbol sets, which are manually labeled during the training phase. The gesture-speech correlation is modeled by examining the co-occurring speech and gesture patterns. This correlation can be used to fuse gesture and speech modalities for edutainment applications (e.g., video games, 3-D animations) where natural gestures of talking avatars are animated from speech. A speech-driven gesture animation example has been implemented for demonstration.
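The co-occurrence modeling described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes time-aligned sequences of discrete speech-event labels (e.g., prosodic accents, spotted keywords) and manually labeled gesture symbols, counts their co-occurrences, and uses the counts as a simple speech-driven gesture synthesis rule. All label names and functions below are illustrative.

```python
from collections import Counter, defaultdict

def cooccurrence_model(speech_events, gesture_labels):
    """Count co-occurring (speech event, gesture) pairs from
    two time-aligned label sequences of equal length."""
    counts = defaultdict(Counter)
    for s, g in zip(speech_events, gesture_labels):
        counts[s][g] += 1
    return counts

def most_likely_gesture(counts, speech_event):
    """Return the gesture that most frequently co-occurred with the
    given speech event -- a toy speech-driven synthesis rule."""
    if speech_event not in counts or not counts[speech_event]:
        return None
    return counts[speech_event].most_common(1)[0][0]

# Toy aligned sequences: speech events vs. head-gesture symbols.
speech = ["accent", "none", "accent", "keyword_yes", "none"]
gesture = ["nod", "still", "nod", "nod", "still"]
model = cooccurrence_model(speech, gesture)
```

With these toy sequences, `most_likely_gesture(model, "accent")` returns `"nod"`, mirroring the idea that a detected prosodic event can drive an avatar's gesture animation.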
Description
Source:
2006 IEEE International Conference on Multimedia and Expo - ICME 2006, Vols 1-5, Proceedings
Publisher:
IEEE
Subject
Computer science, Artificial intelligence, Imaging science, Photographic technology, Telecommunications