Publication: Audio-driven human body motion analysis and synthesis
Co-Authors
Canton-Ferrer, C.
Tilmanne, J.
Bozkurt, E.
Publication Date
2008
Abstract
This paper presents a framework for audio-driven human body motion analysis and synthesis. We address the problem in the context of a dance performance, where the gestures and movements of the dancer are mainly driven by a musical piece and characterized by the repetition of a set of dance figures. The system is trained in a supervised manner using multiview video recordings of the dancer. The human body posture is extracted from the multiview video information without any human intervention, using a novel marker-based algorithm built on annealing particle filtering. Audio is analyzed to extract beat and tempo information. The joint analysis of audio and motion features provides a correlation model that is then used to animate a dancing avatar when driven by any musical piece of the same genre. Results are provided showing the effectiveness of the proposed algorithm.
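The abstract mentions extracting beat and tempo information from the audio but does not detail the method. As a rough illustration (not the paper's actual algorithm), tempo can be estimated by autocorrelating an onset-strength envelope and picking the beat period with the strongest self-similarity; the `estimate_tempo` helper and its frame rate below are hypothetical:

```python
# Hedged sketch: tempo estimation via autocorrelation of an onset-strength
# envelope. This is a generic technique, not the paper's specific method.

def estimate_tempo(onsets, fps=100.0, bpm_min=30.0, bpm_max=200.0):
    """Return the tempo (BPM) whose beat period best matches the
    autocorrelation peak of the onset-strength envelope `onsets`."""
    n = len(onsets)
    lag_min = int(60.0 * fps / bpm_max)   # shortest beat period (frames)
    lag_max = int(60.0 * fps / bpm_min)   # longest beat period (frames)
    best_lag, best_score = lag_min, float("-inf")
    for lag in range(lag_min, min(lag_max, n - 1) + 1):
        # Raw autocorrelation of the envelope at this lag.
        score = sum(onsets[i] * onsets[i - lag] for i in range(lag, n))
        if score > best_score:
            best_score, best_lag = score, lag
    return 60.0 * fps / best_lag

# Synthetic onset envelope: one impulse every 0.5 s at 100 frames/s,
# i.e. a 120 BPM beat.
envelope = [1.0 if i % 50 == 0 else 0.0 for i in range(1000)]
print(round(estimate_tempo(envelope), 1))  # 120.0
```

In a real system the onset envelope would come from spectral-flux analysis of the audio, and the detected beats could then be aligned with the repeated dance figures for the joint audio-motion correlation model the abstract describes.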
Publisher
IEEE
Subject
Acoustics, Computer science, Artificial intelligence, Cybernetics, Engineering, Biomedical engineering, Electrical and electronic engineering, Computational biology, Imaging science, Photographic technology, Radiology, Nuclear medicine, Medical imaging, Telecommunications
Source
2008 IEEE International Conference on Acoustics, Speech and Signal Processing, Vols 1-12
DOI
10.1109/ICASSP.2008.4518089