Publication:
The JESTKOD database: an affective multimodal database of dyadic interactions

dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Graduate School of Sciences and Engineering
dc.contributor.kuauthor: Bozkurt, Elif
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Keçeci, Sinan
dc.contributor.kuauthor: Khaki, Hossein
dc.contributor.kuauthor: Türker, Bekir Berker
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
dc.date.accessioned: 2024-11-10T00:04:59Z
dc.date.issued: 2017
dc.description.abstract: In human-to-human communication, gesture and speech co-exist in time with a tight synchrony, and gestures are often utilized to complement or to emphasize speech. In human-computer interaction systems, natural, affective, and believable use of gestures would be a valuable key component in adopting and emphasizing human-centered aspects. However, natural and affective multimodal data for studying computational models of gesture and speech is limited. In this study, we introduce the JESTKOD database, which consists of speech and full-body motion capture data recordings in a dyadic interaction setting under agreement and disagreement scenarios. Participants of the dyadic interactions are native Turkish speakers, and the recordings of each participant are rated in dimensional affect space. We present our multimodal data collection and annotation process, as well as our preliminary experimental studies on agreement/disagreement classification of dyadic interactions using body gesture and speech data. The JESTKOD database provides a valuable asset for investigating gesture and speech towards designing more natural and affective human-computer interaction systems.
dc.description.indexedby: WOS
dc.description.indexedby: Scopus
dc.description.issue: 3
dc.description.openaccess: NO
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.volume: 51
dc.identifier.doi: 10.1007/s10579-016-9377-0
dc.identifier.eissn: 1574-0218
dc.identifier.issn: 1574-020X
dc.identifier.quartile: Q3
dc.identifier.scopus: 2-s2.0-84997142658
dc.identifier.uri: https://doi.org/10.1007/s10579-016-9377-0
dc.identifier.uri: https://hdl.handle.net/20.500.14288/16367
dc.identifier.wos: 407360600011
dc.keywords: Gesture
dc.keywords: Speech
dc.keywords: Affective state tracking
dc.keywords: Human-computer interaction
dc.keywords: Dyadic interaction
dc.language.iso: eng
dc.publisher: Springer
dc.relation.ispartof: Language Resources and Evaluation
dc.subject: Computer science
dc.title: The JESTKOD database: an affective multimodal database of dyadic interactions
dc.type: Journal Article
dspace.entity.type: Publication
local.contributor.kuauthor: Bozkurt, Elif
local.contributor.kuauthor: Khaki, Hossein
local.contributor.kuauthor: Keçeci, Sinan
local.contributor.kuauthor: Türker, Bekir Berker
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Erzin, Engin
local.publication.orgunit1: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
local.publication.orgunit1: College of Engineering
local.publication.orgunit2: Department of Computer Engineering
local.publication.orgunit2: Graduate School of Sciences and Engineering
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication: 3fc31c89-e803-4eb1-af6b-6258bc42c3d8
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication: 434c9663-2b11-4e66-9399-c863e2ebae43
relation.isParentOrgUnitOfPublication.latestForDiscovery: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164