Publication: Prosody-driven head-gesture animation
| dc.contributor.coauthor | Erdem, A. T. | |
| dc.contributor.coauthor | Erdem, C. | |
| dc.contributor.coauthor | Özkan, M. | |
| dc.contributor.department | Department of Electrical and Electronics Engineering | |
| dc.contributor.department | Department of Computer Engineering | |
| dc.contributor.department | Graduate School of Sciences and Engineering | |
| dc.contributor.kuauthor | Erzin, Engin | |
| dc.contributor.kuauthor | Sargın, Mehmet Emre | |
| dc.contributor.kuauthor | Tekalp, Ahmet Murat | |
| dc.contributor.kuauthor | Yemez, Yücel | |
| dc.contributor.schoolcollegeinstitute | College of Engineering | |
| dc.contributor.schoolcollegeinstitute | GRADUATE SCHOOL OF SCIENCES AND ENGINEERING | |
| dc.date.accessioned | 2024-11-09T23:22:33Z | |
| dc.date.issued | 2007 | |
| dc.description.abstract | We present a new framework for joint analysis of head-gesture and speech-prosody patterns of a speaker towards automatic, realistic synthesis of head gestures from speech prosody. The proposed two-stage analysis aims to "learn" both elementary prosody and head-gesture patterns for a particular speaker, as well as the correlations between these head-gesture and prosody patterns, from a training video sequence. The resulting audio-visual mapping model is then employed to synthesize natural head gestures from arbitrary input test speech, given a head model for the speaker. Objective and subjective evaluations indicate that the proposed synthesis-by-analysis scheme provides natural-looking head gestures for the speaker with any input test speech. | |
| dc.description.indexedby | WOS | |
| dc.description.indexedby | Scopus | |
| dc.description.openaccess | NO | |
| dc.description.publisherscope | International | |
| dc.description.sponsoredbyTubitakEu | N/A | |
| dc.description.sponsorship | European FP6 Network of Excellence SIMILAR | |
| dc.description.sponsorship | TUBITAK [EEEAG-106E201] | |
| dc.description.sponsorship | COST2102 action. The authors would like to thank Momentum Inc. for making the talking-head avatar available and for their collaboration in building the MVGL-MASAL gesture-speech database. This work has been supported by the European FP6 Network of Excellence SIMILAR (http://ww.similar.cc) and by TUBITAK under project EEEAG-106E201 and the COST2102 action. The work of M.E. Sargin was done while he worked at Koc University. | |
| dc.identifier.issn | 1520-6149 | |
| dc.identifier.quartile | N/A | |
| dc.identifier.scopus | 2-s2.0-34547506366 | |
| dc.identifier.uri | https://hdl.handle.net/20.500.14288/11074 | |
| dc.identifier.wos | 248908100170 | |
| dc.keywords | Man-machine systems | |
| dc.keywords | Multimedia systems | |
| dc.keywords | Gesture and prosody analysis | |
| dc.keywords | Gesture synthesis | |
| dc.language.iso | eng | |
| dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
| dc.relation.ispartof | 2007 IEEE International Conference on Acoustics, Speech, and Signal Processing, Vol Ii, Pts 1-3 | |
| dc.subject | Acoustics | |
| dc.subject | Computer science | |
| dc.title | Prosody-driven head-gesture animation | |
| dc.type | Conference Proceeding | |
| dspace.entity.type | Publication | |
| local.contributor.kuauthor | Sargın, Mehmet Emre | |
| local.contributor.kuauthor | Erzin, Engin | |
| local.contributor.kuauthor | Yemez, Yücel | |
| local.contributor.kuauthor | Tekalp, Ahmet Murat | |
| local.publication.orgunit1 | GRADUATE SCHOOL OF SCIENCES AND ENGINEERING | |
| local.publication.orgunit1 | College of Engineering | |
| local.publication.orgunit2 | Department of Computer Engineering | |
| local.publication.orgunit2 | Department of Electrical and Electronics Engineering | |
| local.publication.orgunit2 | Graduate School of Sciences and Engineering | |
| person.familyName | Erzin | |
| person.familyName | Sargın | |
| person.familyName | Tekalp | |
| person.familyName | Yemez | |
| person.givenName | Engin | |
| person.givenName | Mehmet Emre | |
| person.givenName | Ahmet Murat | |
| person.givenName | Yücel | |
| relation.isOrgUnitOfPublication | 21598063-a7c5-420d-91ba-0cc9b2db0ea0 | |
| relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
| relation.isOrgUnitOfPublication | 3fc31c89-e803-4eb1-af6b-6258bc42c3d8 | |
| relation.isOrgUnitOfPublication.latestForDiscovery | 21598063-a7c5-420d-91ba-0cc9b2db0ea0 | |
| relation.isParentOrgUnitOfPublication | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 | |
| relation.isParentOrgUnitOfPublication | 434c9663-2b11-4e66-9399-c863e2ebae43 | |
| relation.isParentOrgUnitOfPublication.latestForDiscovery | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 |
