Publication:
Affect-expressive hand gestures synthesis and animation

dc.contributor.department: Department of Computer Engineering
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Bozkurt, Elif
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: 34503
dc.contributor.yokid: N/A
dc.contributor.yokid: 107907
dc.date.accessioned: 2024-11-09T23:11:46Z
dc.date.issued: 2015
dc.description.abstract: Speech and hand gestures form a composite communicative signal that boosts the naturalness and affectiveness of communication. We present a multimodal framework for the joint analysis of continuous affect, speech prosody, and hand gestures, towards the automatic synthesis of realistic hand gestures from spontaneous speech using hidden semi-Markov models (HSMMs). To the best of our knowledge, this is the first attempt at synthesizing hand gestures using a continuous dimensional affect space, i.e., activation, valence, and dominance. We model the relationships between acoustic features describing speech prosody and hand gestures, with and without the continuous affect information, in speaker-independent configurations, and evaluate the multimodal analysis framework by generating hand gesture animations as well as through objective evaluations. Our experimental studies are promising, conveying the role of affect in modeling the dynamics of the speech-gesture relationship. © 2015 IEEE.
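
Note: the abstract names hidden semi-Markov models (HSMMs) as the synthesis engine. As an illustrative aside, the minimal sketch below shows the defining HSMM mechanic, namely state durations drawn from an explicit distribution instead of the geometric durations implied by a plain HMM's self-transitions. The gesture-state names, Poisson duration means, and transition matrix are toy assumptions, not values from the paper.

import numpy as np

# Minimal hidden semi-Markov model (HSMM) sampler. All states and
# parameters below are illustrative assumptions, not taken from the paper.
rng = np.random.default_rng(0)

states = ["rest", "beat", "iconic"]           # hypothetical gesture clusters
trans = np.array([[0.0, 0.7, 0.3],            # no self-transitions: duration
                  [0.5, 0.0, 0.5],            # is modeled explicitly instead
                  [0.6, 0.4, 0.0]])
mean_dur = np.array([8.0, 3.0, 5.0])          # Poisson duration means (frames)

def sample_hsmm(n_frames, start=0):
    """Sample a frame-level gesture-state sequence of length n_frames."""
    seq, s = [], start
    while len(seq) < n_frames:
        d = 1 + rng.poisson(mean_dur[s])      # explicit state duration
        seq.extend([states[s]] * d)           # stay in state s for d frames
        s = rng.choice(len(states), p=trans[s])  # then jump to another state
    return seq[:n_frames]

print(sample_hsmm(30))

In the paper's setting, these states would additionally be coupled to prosodic features and to the continuous affect dimensions (activation, valence, dominance) through emission distributions; the sketch covers only the semi-Markov duration mechanics.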
dc.description.indexedby: Scopus
dc.description.indexedby: WoS
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.volume: 2015-August
dc.identifier.doi: 10.1109/ICME.2015.7177478
dc.identifier.isbn: 978-1-4799-7082-7
dc.identifier.issn: 1945-7871
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84946017425&doi=10.1109%2fICME.2015.7177478&partnerID=40&md5=c5c40ccca4c36267822065b2d36d9589
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-84946017425
dc.identifier.uri: http://dx.doi.org/10.1109/ICME.2015.7177478
dc.identifier.uri: https://hdl.handle.net/20.500.14288/9696
dc.identifier.wos: 380486500101
dc.keywords: Continuous affect
dc.keywords: Gesture animation
dc.keywords: Hidden semi-Markov models
dc.keywords: Prosody analysis
dc.keywords: Hidden Markov models
dc.keywords: Modal analysis
dc.keywords: Speech
dc.keywords: Automatic synthesis
dc.keywords: Multimodal frameworks
dc.keywords: Objective evaluation
dc.keywords: Speaker independent
dc.keywords: Speech communication
dc.language: English
dc.publisher: IEEE
dc.source: Proceedings - IEEE International Conference on Multimedia and Expo
dc.subject: Computer engineering
dc.title: Affect-expressive hand gestures synthesis and animation
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-7515-3138
local.contributor.kuauthor: Erzin, Engin
local.contributor.kuauthor: Bozkurt, Elif
local.contributor.kuauthor: Yemez, Yücel
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
