Publication:
The JESTKOD database: an affective multimodal database of dyadic interactions

Abstract

In human-to-human communication, gesture and speech co-exist in time with tight synchrony, and gestures are often used to complement or emphasize speech. In human-computer interaction systems, natural, affective, and believable use of gestures would be a valuable key component for adopting and emphasizing human-centered aspects. However, natural and affective multimodal data for studying computational models of gesture and speech are limited. In this study, we introduce the JESTKOD database, which consists of speech and full-body motion capture recordings in a dyadic interaction setting under agreement and disagreement scenarios. Participants in the dyadic interactions are native Turkish speakers, and the recordings of each participant are rated in a dimensional affect space. We present our multimodal data collection and annotation process, as well as our preliminary experimental studies on agreement/disagreement classification of dyadic interactions using body gesture and speech data. The JESTKOD database provides a valuable asset for investigating gesture and speech toward designing more natural and affective human-computer interaction systems.
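
To make the classification task concrete, the following is a minimal sketch of an agreement/disagreement classifier over per-clip gesture and speech features. It is an illustration only: the feature dimensions, the random placeholder data, and the SVM-with-standardization pipeline are assumptions made for the sketch, not the feature extraction or classifiers used in the JESTKOD study.

```python
# Minimal sketch of an agreement/disagreement classifier over per-clip
# gesture and speech features. All dimensions, the placeholder data, and
# the SVM choice are illustrative assumptions, NOT the JESTKOD pipeline.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: one feature vector per interaction clip, concatenating
# (hypothetical) motion-capture statistics and speech prosody statistics;
# label 1 = agreement, 0 = disagreement.
n_clips, gesture_dim, speech_dim = 200, 60, 24
X = rng.normal(size=(n_clips, gesture_dim + speech_dim))
y = rng.integers(0, 2, size=n_clips)

# Standardize features, then fit an RBF-kernel SVM; report 5-fold accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```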

Publisher

Springer

Subject

Computer science

Source

Language Resources and Evaluation

DOI

10.1007/s10579-016-9377-0
