Publication:
Multimodal analysis of speech and arm motion for prosody-driven synthesis of beat gestures

dc.contributor.coauthor: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Graduate School of Sciences and Engineering
dc.contributor.kuauthor: Bozkurt, Elif
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
dc.date.accessioned: 2024-11-09T23:03:21Z
dc.date.issued: 2016
dc.description.abstract: We propose a framework for the joint analysis of speech prosody and arm motion, towards the automatic synthesis and realistic animation of beat gestures from speech prosody and rhythm. In the analysis stage, we first segment motion capture data and speech audio into gesture phrases and prosodic units via temporal clustering, and assign a class label to each resulting gesture phrase and prosodic unit. We then train a discrete hidden semi-Markov model (HSMM) over the segmented data, where gesture labels are hidden states with duration statistics and frame-level prosody labels are observations. The HSMM structure allows us to effectively map sequences of shorter-duration prosodic units to longer-duration gesture phrases. In the analysis stage, we also construct a gesture pool consisting of gesture phrases segmented from the available dataset, where each gesture phrase is associated with a class label and a speech rhythm representation. In the synthesis stage, we use a modified Viterbi algorithm with a duration model, which decodes the optimal gesture label sequence with duration information over the HSMM, given a sequence of prosody labels. In the animation stage, the synthesized gesture label sequence with duration and speech rhythm information is mapped into a motion sequence by a multiple-objective unit selection algorithm. Our framework is tested on two multimodal datasets in speaker-dependent and speaker-independent settings. The resulting motion sequence, when accompanied by the speech input, yields natural-looking and plausible animations. We use objective evaluations to set the parameters of the proposed prosody-driven gesture animation system, and subjective evaluations to assess the quality of the resulting animations. The conducted subjective evaluations show that the difference between the proposed HSMM-based synthesis and the motion capture synthesis is not statistically significant. Furthermore, the proposed HSMM-based synthesis is rated significantly better than a baseline synthesis that animates random gestures based only on joint-angle continuity. (C) 2016 Elsevier B.V. All rights reserved.
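The synthesis stage described in the abstract decodes gesture labels with a duration-explicit (modified) Viterbi pass over the HSMM. As a hedged sketch only — this is not the authors' implementation; the function name, segment scoring, and any probabilities used with it are illustrative assumptions — such a decoder over discrete prosody-label observations might look like:

```python
import numpy as np

def hsmm_viterbi(obs, log_pi, log_A, log_B, log_D, max_dur):
    """Duration-explicit Viterbi decoding over a discrete HSMM (illustrative sketch).

    obs     : list of prosody-label indices, length T
    log_pi  : (S,) log initial-state probabilities
    log_A   : (S, S) log transition probabilities between gesture states
    log_B   : (S, V) log per-frame emission probabilities of prosody labels
    log_D   : (S, max_dur) log duration probabilities, log_D[s, d-1] = log P(dur=d | s)
    Returns the decoded segment sequence as (state, duration) pairs.
    """
    T, S = len(obs), log_pi.shape[0]
    # Cumulative emission log-likelihoods so a segment's emission score is O(1).
    cum = np.zeros((S, T + 1))
    for s in range(S):
        cum[s, 1:] = np.cumsum(log_B[s, obs])
    delta = np.full((T + 1, S), -np.inf)  # best score of a segmentation ending at frame t in state s
    back = {}                             # (t, s) -> (previous state, segment duration)
    for t in range(1, T + 1):
        for s in range(S):
            for d in range(1, min(max_dur, t) + 1):
                emit = cum[s, t] - cum[s, t - d]          # frames t-d..t-1 emitted by state s
                seg = emit + log_D[s, d - 1]              # plus explicit duration score
                if t - d == 0:                            # first segment: use the initial distribution
                    prev, score = -1, log_pi[s] + seg
                else:
                    prevs = delta[t - d] + log_A[:, s]
                    prev = int(np.argmax(prevs))
                    score = prevs[prev] + seg
                if score > delta[t, s]:
                    delta[t, s] = score
                    back[(t, s)] = (prev, d)
    # Backtrack from the best final state to recover (state, duration) segments.
    s, t, segments = int(np.argmax(delta[T])), T, []
    while t > 0:
        prev, d = back[(t, s)]
        segments.append((s, d))
        t -= d
        s = prev
    return segments[::-1]
```

With a toy two-state model whose states prefer opposite prosody labels and longer durations, the decoder groups the frames into two three-frame gesture segments rather than many short ones, which is the behavior the duration model is there to encourage.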
dc.description.indexedby: WOS
dc.description.indexedby: Scopus
dc.description.openaccess: NO
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.volume: 85
dc.identifier.doi: 10.1016/j.specom.2016.10.004
dc.identifier.eissn: 1872-7182
dc.identifier.issn: 0167-6393
dc.identifier.quartile: Q2
dc.identifier.scopus: 2-s2.0-84992755423
dc.identifier.uri: https://doi.org/10.1016/j.specom.2016.10.004
dc.identifier.uri: https://hdl.handle.net/20.500.14288/8456
dc.identifier.wos: 390507000004
dc.keywords: Joint analysis of speech and gesture
dc.keywords: Speech-driven gesture animation
dc.keywords: Prosody-driven gesture synthesis
dc.keywords: Speech rhythm
dc.keywords: Unit selection
dc.keywords: Hidden semi-Markov models
dc.keywords: Utterances
dc.language.iso: eng
dc.publisher: Elsevier
dc.relation.grantno: Turk Telekom [11315-02]
dc.relation.grantno: TUBITAK [113E102]. This work was supported by Turk Telekom under Grant Number 11315-02 and by TUBITAK under Grant Number 113E102.
dc.relation.ispartof: Speech Communication
dc.subject: Acoustics
dc.subject: Computer science
dc.title: Multimodal analysis of speech and arm motion for prosody-driven synthesis of beat gestures
dc.type: Journal Article
dspace.entity.type: Publication
local.contributor.kuauthor: Bozkurt, Elif
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Erzin, Engin
local.publication.orgunit1: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
local.publication.orgunit1: College of Engineering
local.publication.orgunit2: Department of Computer Engineering
local.publication.orgunit2: Graduate School of Sciences and Engineering
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication: 3fc31c89-e803-4eb1-af6b-6258bc42c3d8
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication: 434c9663-2b11-4e66-9399-c863e2ebae43
relation.isParentOrgUnitOfPublication.latestForDiscovery: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164