Publication: Multi-modal analysis of dance performances for music-driven choreography synthesis
dc.contributor.coauthor | N/A | |
dc.contributor.department | Department of Electrical and Electronics Engineering | |
dc.contributor.department | Department of Computer Engineering | |
dc.contributor.department | Graduate School of Sciences and Engineering | |
dc.contributor.kuauthor | Erzin, Engin | |
dc.contributor.kuauthor | Ofli, Ferda | |
dc.contributor.kuauthor | Tekalp, Ahmet Murat | |
dc.contributor.kuauthor | Yemez, Yücel | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.contributor.schoolcollegeinstitute | GRADUATE SCHOOL OF SCIENCES AND ENGINEERING | |
dc.date.accessioned | 2024-11-09T22:52:17Z | |
dc.date.issued | 2010 | |
dc.description.abstract | We propose a framework for the modeling, analysis, annotation, and synthesis of multi-modal dance performances. We analyze correlations between music features and dance figure labels on training dance videos in order to construct a mapping from music measures (segments) to dance figures toward generating music-driven dance choreographies. We assume that dance figure segment boundaries coincide with music measure (audio segment) boundaries. For each training video, figure segments are manually labeled by an expert to indicate the type of dance motion. Chroma features of each measure are used for music analysis. We model the temporal statistics of the chroma features corresponding to each dance figure label to identify distinct rhythmic patterns for that dance motion. The correlations between dance figures and music measures, as well as the correlations between consecutive dance figures, are used to construct a mapping for music-driven dance choreography synthesis. Experimental results demonstrate the success of the proposed music-driven choreography synthesis framework. | |
dc.description.indexedby | WOS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | NO | |
dc.description.sponsoredbyTubitakEu | N/A | |
dc.identifier.doi | 10.1109/ICASSP.2010.5494891 | |
dc.identifier.isbn | 978-1-4244-4296-6 | |
dc.identifier.issn | 1520-6149 | |
dc.identifier.scopus | 2-s2.0-78049410117 | |
dc.identifier.uri | https://doi.org/10.1109/ICASSP.2010.5494891 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/6981 | |
dc.identifier.wos | 287096002104 | |
dc.keywords | Multimodal dance modeling | |
dc.keywords | Music-driven dance | |
dc.keywords | Choreography synthesis | |
dc.keywords | Beat tracking | |
dc.keywords | Audio | |
dc.language.iso | eng | |
dc.publisher | IEEE | |
dc.relation.ispartof | 2010 IEEE International Conference on Acoustics, Speech, and Signal Processing | |
dc.subject | Acoustics | |
dc.subject | Computer science | |
dc.subject | Engineering | |
dc.subject | Electrical and electronic engineering | |
dc.title | Multi-modal analysis of dance performances for music-driven choreography synthesis | |
dc.type | Conference Proceeding | |
dspace.entity.type | Publication | |
local.contributor.kuauthor | Ofli, Ferda | |
local.contributor.kuauthor | Erzin, Engin | |
local.contributor.kuauthor | Yemez, Yücel | |
local.contributor.kuauthor | Tekalp, Ahmet Murat | |
local.publication.orgunit1 | GRADUATE SCHOOL OF SCIENCES AND ENGINEERING | |
local.publication.orgunit1 | College of Engineering | |
local.publication.orgunit2 | Department of Computer Engineering | |
local.publication.orgunit2 | Department of Electrical and Electronics Engineering | |
local.publication.orgunit2 | Graduate School of Sciences and Engineering | |
relation.isOrgUnitOfPublication | 21598063-a7c5-420d-91ba-0cc9b2db0ea0 | |
relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
relation.isOrgUnitOfPublication | 3fc31c89-e803-4eb1-af6b-6258bc42c3d8 | |
relation.isOrgUnitOfPublication.latestForDiscovery | 21598063-a7c5-420d-91ba-0cc9b2db0ea0 | |
relation.isParentOrgUnitOfPublication | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 | |
relation.isParentOrgUnitOfPublication | 434c9663-2b11-4e66-9399-c863e2ebae43 | |
relation.isParentOrgUnitOfPublication.latestForDiscovery | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 |