Publication: Use of affect based interaction classification for continuous emotion tracking
dc.contributor.coauthor | N/A | |
dc.contributor.department | N/A | |
dc.contributor.department | Department of Computer Engineering | |
dc.contributor.kuauthor | Khaki, Hossein | |
dc.contributor.kuauthor | Erzin, Engin | |
dc.contributor.kuprofile | PhD Student | |
dc.contributor.kuprofile | Faculty Member | |
dc.contributor.other | Department of Computer Engineering | |
dc.contributor.schoolcollegeinstitute | Graduate School of Sciences and Engineering | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.contributor.yokid | N/A | |
dc.contributor.yokid | 34503 | |
dc.date.accessioned | 2024-11-09T23:19:19Z | |
dc.date.issued | 2017 | |
dc.description.abstract | Natural and affective handshakes of two participants define the course of dyadic interaction. Affective states of the participants are expected to be correlated with the nature of the dyadic interaction. In this paper, we extract two classes of the dyadic interaction based on temporal clustering of affective states. We use k-means temporal clustering to define the interaction classes, and utilize a support vector machine (SVM) based classifier to estimate the interaction class types from multimodal speech and motion features. Then, we investigate the continuous emotion tracking problem over the dyadic interaction classes. We use the JESTKOD database, which consists of speech and full-body motion capture data recordings of dyadic interactions with affective annotations in activation, valence and dominance (AVD) attributes. Continuous affect tracking is performed as estimation of the AVD attributes. Experimental evaluation results show statistically significant (p < 0.05) improvements in affective state estimation using the interaction class information. | |
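The abstract describes a three-stage pipeline: cluster the affective annotations into two interaction classes, estimate the interaction class from multimodal features with an SVM, and then track the continuous AVD attributes conditioned on the estimated class. The lines below are a rough, hypothetical sketch of that pipeline using scikit-learn, not the authors' implementation: the placeholder arrays, the plain (non-temporal) k-means, and the choice of SVR regressors for the tracking stage are all assumptions made for illustration.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC, SVR

    # Placeholders standing in for JESTKOD data (loading is assumed elsewhere):
    # avd   - per-frame activation/valence/dominance annotations, shape (n, 3)
    # feats - multimodal speech + motion features, shape (n, d)
    rng = np.random.default_rng(0)
    avd = rng.random((1000, 3))
    feats = rng.random((1000, 60))

    # 1) Define two interaction classes by clustering the affective annotations
    #    (the paper uses temporal k-means; plain k-means is used here for brevity).
    interaction_class = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(avd)

    # 2) Train an SVM-based classifier to estimate the interaction class
    #    from the multimodal features.
    class_clf = SVC(kernel="rbf").fit(feats, interaction_class)

    # 3) Train one continuous-affect regressor per class and per AVD attribute,
    #    so tracking can be conditioned on the estimated interaction class.
    trackers = {
        c: [SVR().fit(feats[interaction_class == c], avd[interaction_class == c, a])
            for a in range(3)]
        for c in (0, 1)
    }

    # Test time: estimate the interaction class, then track AVD with that
    # class's regressors.
    pred_class = class_clf.predict(feats)
    avd_est = np.array([
        [trackers[c][a].predict(f[None, :])[0] for a in range(3)]
        for f, c in zip(feats, pred_class)
    ])

The point of the conditioning step is the one the abstract reports: using the estimated interaction class as side information when tracking AVD is what yields the significant improvement over class-agnostic tracking.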
dc.description.indexedby | WoS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | YES | |
dc.description.publisherscope | International | |
dc.description.sponsorship | TUBITAK [113E102] This work is supported by TUBITAK under Grant Number 113E102. | |
dc.identifier.doi | 10.1109/ICASSP.2017.7952683 | |
dc.identifier.isbn | 978-1-5090-4117-6 | |
dc.identifier.issn | 1520-6149 | |
dc.identifier.link | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85023780933&doi=10.1109%2fICASSP.2017.7952683&partnerID=40&md5=954810f08af7bd1c58a42dec1a13d8fe | |
dc.identifier.scopus | 2-s2.0-85023780933 | |
dc.identifier.uri | http://dx.doi.org/10.1109/ICASSP.2017.7952683 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/10532 | |
dc.identifier.wos | 414286203010 | |
dc.keywords | Dyadic interaction type | |
dc.keywords | Human-computer interaction | |
dc.keywords | JESTKOD database | |
dc.keywords | Multimodal continuous emotion recognition | |
dc.language | English | |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
dc.source | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings | |
dc.subject | Acoustics | |
dc.subject | Engineering | |
dc.subject | Electrical and electronic engineering | |
dc.title | Use of affect based interaction classification for continuous emotion tracking | |
dc.type | Conference proceeding | |
dspace.entity.type | Publication | |
local.contributor.authorid | N/A | |
local.contributor.authorid | 0000-0002-2715-2368 | |
local.contributor.kuauthor | Khaki, Hossein | |
local.contributor.kuauthor | Erzin, Engin | |
relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
relation.isOrgUnitOfPublication.latestForDiscovery | 89352e43-bf09-4ef4-82f6-6f9d0174ebae |