Publication:
The JESTKOD database: an affective multimodal database of dyadic interactions

dc.contributor.department: N/A
dc.contributor.department: N/A
dc.contributor.department: N/A
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Bozkurt, Elif
dc.contributor.kuauthor: Khaki, Hossein
dc.contributor.kuauthor: Keçeci, Sinan
dc.contributor.kuauthor: Türker, Bekir Berker
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Master Student
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: 107907
dc.contributor.yokid: 34503
dc.date.accessioned: 2024-11-10T00:04:59Z
dc.date.issued: 2017
dc.description.abstract: In human-to-human communication, gesture and speech co-exist in time with tight synchrony, and gestures are often utilized to complement or to emphasize speech. In human-computer interaction systems, natural, affective and believable use of gestures would be a valuable key component in adopting and emphasizing human-centered aspects. However, natural and affective multimodal data for studying computational models of gesture and speech is limited. In this study, we introduce the JESTKOD database, which consists of speech and full-body motion capture data recordings in a dyadic interaction setting under agreement and disagreement scenarios. Participants of the dyadic interactions are native Turkish speakers, and the recordings of each participant are rated in dimensional affect space. We present our multimodal data collection and annotation process, as well as our preliminary experimental studies on agreement/disagreement classification of dyadic interactions using body gesture and speech data. The JESTKOD database provides a valuable asset for investigating gesture and speech towards designing more natural and affective human-computer interaction systems.
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.issue: 3
dc.description.openaccess: NO
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.volume: 51
dc.identifier.doi: 10.1007/s10579-016-9377-0
dc.identifier.eissn: 1574-0218
dc.identifier.issn: 1574-020X
dc.identifier.quartile: Q3
dc.identifier.scopus: 2-s2.0-84997142658
dc.identifier.uri: http://dx.doi.org/10.1007/s10579-016-9377-0
dc.identifier.uri: https://hdl.handle.net/20.500.14288/16367
dc.identifier.wos: 407360600011
dc.keywords: Gesture
dc.keywords: Speech
dc.keywords: Affective state tracking
dc.keywords: Human-computer interaction
dc.keywords: Dyadic interaction
dc.language: English
dc.publisher: Springer
dc.source: Language Resources and Evaluation
dc.subject: Computer science
dc.title: The JESTKOD database: an affective multimodal database of dyadic interactions
dc.type: Journal Article
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-7515-3138
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.kuauthor: Bozkurt, Elif
local.contributor.kuauthor: Khaki, Hossein
local.contributor.kuauthor: Keçeci, Sinan
local.contributor.kuauthor: Türker, Bekir Berker
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Erzin, Engin
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
