Publication:
Speech driven 3D head gesture synthesis

Program

KU Authors

Co-Authors

Erdem, A. Tanju

Advisor

Publication Date

Language

Turkish

Journal Title

Journal ISSN

Volume Title

Abstract

In this paper, we present a speech-driven natural head gesture analysis and synthesis system. The proposed system assumes that sharp head movements are correlated with prominence in speech. For analysis, a binocular camera system is employed to capture the head motion of a talking person. The motion parameters associated with the 3D head motion are then used to extract repetitive head gestures. In parallel, prosodic events are detected using an HMM structure with pitch and formant frequencies and speech intensity as audio features. For synthesis, the head motion parameters are estimated from the prosodic events based on a gesture-speech correlation model, and the associated Euler angles are then used for speech-driven animation of a personalized 3D talking head model. Results on head motion feature extraction, prosodic event detection and correlation modelling are provided.
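
To illustrate the kind of pipeline the abstract describes (prosodic event detection followed by a mapping to head rotation), the following Python sketch labels prominent speech frames with a two-state Gaussian HMM and converts prominent stretches into a smoothed head-nod Euler angle trajectory. This is not the authors' implementation: the synthetic pitch and intensity features, the use of hmmlearn, the two-state setup, and the fixed nod amplitude are all illustrative assumptions standing in for the paper's actual features, HMM structure and gesture-speech correlation model.

# Illustrative sketch (not the paper's code): detect "prominent" speech frames
# with a 2-state Gaussian HMM over prosodic features, then map prominence to a
# simple head-nod angle trajectory. Requires numpy and hmmlearn.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
n_frames = 500                              # e.g. 10 ms frames ~ 5 s of speech

# Synthetic frame-level prosodic features: [pitch (Hz), intensity (dB)],
# with a few injected stretches of raised pitch and energy ("prominence").
features = np.column_stack([
    rng.normal(120.0, 10.0, n_frames),      # baseline pitch
    rng.normal(60.0, 3.0, n_frames),        # baseline intensity
])
for start in (100, 250, 400):               # injected prominence events
    features[start:start + 30] += [40.0, 10.0]

# Two hidden states: non-prominent vs. prominent frames.
model = GaussianHMM(n_components=2, covariance_type="diag",
                    n_iter=100, random_state=0)
model.fit(features)
states = model.predict(features)

# The state with the higher mean pitch is taken as the "prominent" state.
prominent_state = int(np.argmax(model.means_[:, 0]))
prominent = states == prominent_state

# Map prominent frames to a downward head nod on the pitch-axis Euler angle
# (degrees), then smooth the binary pulse with a Hanning window so it looks
# like a gesture trajectory rather than a step function.
nod_angle = np.zeros(n_frames)
nod_angle[prominent] = -10.0
kernel = np.hanning(15)
nod_angle = np.convolve(nod_angle, kernel / kernel.sum(), mode="same")

print(f"{prominent.sum()} prominent frames detected")
print("peak nod angle (deg):", round(nod_angle.min(), 2))

In a full system, the binary prominence labels would instead drive head motion parameters learned from the captured 3D head trajectories, and the resulting Euler angles would animate the talking head model.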

Source:

2006 IEEE 14th Signal Processing and Communications Applications, Vols 1 and 2

Publisher:

IEEE

Keywords:

Subject

Computer science, Artificial intelligence, Electrical and electronics engineering, Imaging systems, Photography

Citation

Endorsement

Review

Supplemented By

Referenced By

Copyrights Note
