Publication:
Use of agreement/disagreement classification in dyadic interactions for continuous emotion recognition

dc.contributor.coauthor: N/A
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Khaki, Hossein
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: 34503
dc.date.accessioned: 2024-11-09T23:34:06Z
dc.date.issued: 2016
dc.description.abstract: Natural and affective handshakes of two participants define the course of dyadic interaction. The affective states of the participants are expected to be correlated with the nature, or type, of the dyadic interaction. In this study, we investigate the relationship between affective attributes and the nature of the dyadic interaction. For this investigation we use the JESTKOD database, which consists of speech and full-body motion capture recordings of dyadic interactions under agreement and disagreement scenarios. The dataset also has affective annotations for the activation, valence and dominance (AVD) attributes. We pose the continuous affect recognition problem under agreement and disagreement scenarios of dyadic interactions. We define a statistical mapping using support vector regression (SVR) from the speech and motion modalities to the affective attributes, with and without dyadic interaction type (DIT) information. We observe an improvement in the estimation of the valence attribute when the DIT is available. Furthermore, this improvement is sustained even when the DIT is estimated from the speech and motion modalities of the dyadic interaction.
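The SVR mapping described in the abstract can be illustrated with a small, self-contained sketch. The Python snippet below uses scikit-learn with synthetic placeholder arrays (X_speech, X_motion, dit and y_valence are hypothetical stand-ins, not the JESTKOD features or the paper's evaluation protocol) to show how valence estimation with and without the DIT as an additional input feature could be compared.

# Minimal sketch (not the authors' implementation): SVR mapping from
# speech + motion features to a continuous affective attribute (valence),
# optionally augmented with the dyadic interaction type (DIT) as an extra feature.
# All feature and label arrays below are synthetic placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_frames = 500
X_speech = rng.normal(size=(n_frames, 40))    # e.g. spectral/prosodic speech features
X_motion = rng.normal(size=(n_frames, 30))    # e.g. joint-based motion capture features
dit = rng.integers(0, 2, size=(n_frames, 1))  # 0 = disagreement, 1 = agreement
y_valence = rng.normal(size=n_frames)         # continuous valence annotation

def evaluate(features, target):
    """Cross-validated regression score for an SVR model on the given features."""
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, epsilon=0.1))
    return cross_val_score(model, features, target, cv=5, scoring="r2").mean()

X_base = np.hstack([X_speech, X_motion])      # multimodal features without DIT
X_with_dit = np.hstack([X_base, dit])         # same features plus the DIT label

print("valence estimation without DIT:", evaluate(X_base, y_valence))
print("valence estimation with DIT:   ", evaluate(X_with_dit, y_valence))

With real features and annotations, a higher score for the DIT-augmented model would correspond to the improvement in valence estimation reported in the abstract; in practice the DIT could also be replaced by a classifier's estimate rather than the ground-truth label.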
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: NO
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.description.sponsorship: TUBITAK [113E102] This work was supported by TUBITAK under Grant Number 113E102.
dc.identifier.doi: 10.21437/interspeech.2016-407
dc.identifier.isbn: 978-1-5108-3313-5
dc.identifier.issn: 2308-457X
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-84994376910
dc.identifier.uri: http://dx.doi.org/10.21437/interspeech.2016-407
dc.identifier.uri: https://hdl.handle.net/20.500.14288/12271
dc.identifier.wos: 409394400126
dc.keywords: Multimodal continuous emotion recognition
dc.keywords: Human-computer interaction
dc.keywords: Dyadic interaction type
dc.language: English
dc.publisher: ISCA (International Speech Communication Association)
dc.source: 17th Annual Conference of the International Speech Communication Association (Interspeech 2016), Vols 1-5: Understanding Speech Processing in Humans and Machines
dc.subject: Acoustics
dc.subject: Computer science
dc.subject: Artificial intelligence
dc.subject: Engineering
dc.subject: Electrical and electronic engineering
dc.subject: Linguistics
dc.title: Use of agreement/disagreement classification in dyadic interactions for continuous emotion recognition
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.kuauthor: Khaki, Hossein
local.contributor.kuauthor: Erzin, Engin
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae