Publication: Analysis of JestKOD database using affective state annotations
dc.contributor.coauthor | N/A | |
dc.contributor.department | N/A | |
dc.contributor.department | Department of Computer Engineering | |
dc.contributor.department | Department of Computer Engineering | |
dc.contributor.kuauthor | Keçeci, Sinan | |
dc.contributor.kuauthor | Erzin, Engin | |
dc.contributor.kuauthor | Yemez, Yücel | |
dc.contributor.kuprofile | Master's Student | |
dc.contributor.kuprofile | Faculty Member | |
dc.contributor.kuprofile | Faculty Member | |
dc.contributor.other | Department of Computer Engineering | |
dc.contributor.schoolcollegeinstitute | Graduate School of Sciences and Engineering | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.contributor.yokid | N/A | |
dc.contributor.yokid | 34503 | |
dc.contributor.yokid | 107907 | |
dc.date.accessioned | 2024-11-09T23:45:16Z | |
dc.date.issued | 2016 | |
dc.description.abstract | Gesticulation, together with speech, is an important part of natural and affective human-human interaction. Analysis of gesticulation and speech is expected to help in designing more natural human-computer interaction (HCI) systems. We have built the JestKOD database, which consists of speech and motion capture recordings of dyadic interactions. In this paper, we describe our annotation efforts and present the evaluations that we performed on the annotations of the JestKOD database. These evaluations yield important findings on the usability of the JestKOD database for the analysis and modeling of HCI systems. | |
dc.description.indexedby | WoS | |
dc.description.indexedby | Scopus | |
dc.description.openaccess | YES | |
dc.description.publisherscope | International | |
dc.identifier.doi | 10.1109/SIU.2016.7495919 | |
dc.identifier.isbn | 978-1-5090-1679-2 | |
dc.identifier.link | https://www.scopus.com/inward/record.uri?eid=2-s2.0-84982787194&doi=10.1109%2fSIU.2016.7495919&partnerID=40&md5=e3a18e1ace0eb217e30cf65a31af5f2b | |
dc.identifier.scopus | 2-s2.0-84982787194 | |
dc.identifier.uri | http://dx.doi.org/10.1109/SIU.2016.7495919 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/13804 | |
dc.identifier.wos | 391250900235 | |
dc.keywords | Affective state tracking | |
dc.keywords | Animation | |
dc.keywords | Computer-human interaction | |
dc.keywords | Gesticulation | |
dc.keywords | Machine learning | |
dc.keywords | Speech | |
dc.language | Turkish | |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | |
dc.source | 2016 24th Signal Processing and Communication Application Conference, SIU 2016 - Proceedings | |
dc.subject | Engineering | |
dc.subject | Electrical and electronic engineering | |
dc.title | Analysis of JestKOD database using affective state annotations | |
dc.title.alternative | JestKOD veritabanının duygu durum etiketlemelerini kullanarak analizi | |
dc.type | Conference proceeding | |
dspace.entity.type | Publication | |
local.contributor.authorid | N/A | |
local.contributor.authorid | 0000-0002-2715-2368 | |
local.contributor.authorid | 0000-0002-7515-3138 | |
local.contributor.kuauthor | Keçeci, Sinan | |
local.contributor.kuauthor | Erzin, Engin | |
local.contributor.kuauthor | Yemez, Yücel | |
relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
relation.isOrgUnitOfPublication.latestForDiscovery | 89352e43-bf09-4ef4-82f6-6f9d0174ebae |