Publication:
Use of affect based interaction classification for continuous emotion tracking

dc.contributor.coauthor: N/A
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Khaki, Hossein
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: 34503
dc.date.accessioned: 2024-11-09T23:19:19Z
dc.date.issued: 2017
dc.description.abstract: Natural and affective handshakes of two participants define the course of dyadic interaction. Affective states of the participants are expected to be correlated with the nature of the dyadic interaction. In this paper, we extract two classes of the dyadic interaction based on temporal clustering of affective states. We use k-means temporal clustering to define the interaction classes, and utilize a support vector machine (SVM) based classifier to estimate the interaction class types from multimodal (speech and motion) features. Then, we investigate the continuous emotion tracking problem over the dyadic interaction classes. We use the JESTKOD database, which consists of speech and full-body motion capture data recordings of dyadic interactions with affective annotations in activation, valence and dominance (AVD) attributes. The continuous affect tracking is executed as estimation of the AVD attributes. Experimental evaluation results attain statistically significant (p < 0.05) improvements in affective state estimation using the interaction class information.
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsorship: TUBITAK [113E102] This work is supported by TUBITAK under Grant Number 113E102.
dc.identifier.doi: 10.1109/ICASSP.2017.7952683
dc.identifier.isbn: 978-1-5090-4117-6
dc.identifier.issn: 1520-6149
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85023780933&doi=10.1109%2fICASSP.2017.7952683&partnerID=40&md5=954810f08af7bd1c58a42dec1a13d8fe
dc.identifier.scopus: 2-s2.0-85023780933
dc.identifier.uri: http://dx.doi.org/10.1109/ICASSP.2017.7952683
dc.identifier.uri: https://hdl.handle.net/20.500.14288/10532
dc.identifier.wos: 414286203010
dc.keywords: Dyadic interaction type
dc.keywords: Human-computer interaction
dc.keywords: JESTKOD database
dc.keywords: Multimodal continuous emotion recognition
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.source: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
dc.subject: Acoustics
dc.subject: Engineering
dc.subject: Electrical and electronic engineering
dc.title: Use of affect based interaction classification for continuous emotion tracking
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.kuauthor: Khaki, Hossein
local.contributor.kuauthor: Erzin, Engin
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
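
The abstract above outlines a three-stage pipeline: k-means temporal clustering of affective annotations to define two interaction classes, an SVM that predicts the interaction class from multimodal (speech and motion) features, and continuous estimation of the activation, valence and dominance (AVD) attributes conditioned on the predicted class. The following is a minimal sketch of that flow using scikit-learn; the synthetic arrays, frame-level feature layout, and the ridge regressor used for AVD estimation are illustrative assumptions, not the authors' actual features or estimator.

    # Minimal sketch of the affect-based interaction classification pipeline,
    # assuming per-frame multimodal features and continuous AVD annotations.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)

    # Placeholder data standing in for speech + motion features and AVD labels.
    n_frames, n_features = 2000, 40
    X = rng.normal(size=(n_frames, n_features))   # multimodal features per frame
    avd = rng.normal(size=(n_frames, 3))          # activation, valence, dominance

    # Stage 1: temporal k-means over affective states defines two interaction classes.
    interaction_class = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(avd)

    # Stage 2: an SVM estimates the interaction class from the multimodal features.
    svm = SVC(kernel="rbf").fit(X, interaction_class)
    predicted_class = svm.predict(X)

    # Stage 3: continuous emotion tracking; here one regressor per interaction
    # class so that the class information conditions the AVD estimate.
    trackers = {c: Ridge(alpha=1.0).fit(X[predicted_class == c], avd[predicted_class == c])
                for c in np.unique(predicted_class)}

    avd_hat = np.empty_like(avd)
    for c, model in trackers.items():
        idx = predicted_class == c
        avd_hat[idx] = model.predict(X[idx])
    print(avd_hat.shape)  # (n_frames, 3) estimated AVD trajectories

In the paper, the class-conditioned estimator is reported to yield statistically significant (p < 0.05) improvements over class-agnostic tracking; the per-class regressors above are only one simple way to use the class information.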
