Publication:
Head nod detection in dyadic conversations

dc.contributor.coauthorN/A
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.departmentGraduate School of Sciences and Engineering
dc.contributor.kuauthorErzin, Engin
dc.contributor.kuauthorNumanoğlu, Tuğçe
dc.contributor.kuauthorSezgin, Tevfik Metin
dc.contributor.kuauthorYemez, Yücel
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.schoolcollegeinstituteGRADUATE SCHOOL OF SCIENCES AND ENGINEERING
dc.date.accessioned2024-11-09T22:59:34Z
dc.date.issued2019
dc.description.abstractIn face-to-face interactions, head gestures serve as important back-channel signals. Among them, head nods allow listeners to convey approval or interest as feedback in dyadic conversations. Detecting head nods is therefore expected to improve the understanding of such feedback and, in turn, human-computer interaction. This study aims to detect head nods in order to make human-computer interaction more human-like. In the proposed pipeline, a 3D head model is obtained with the Microsoft Kinect and the OpenFace application. Spectral features extracted from the 3D head motion are then classified with a Support Vector Machine (SVM), yielding binary `head nod' or `not head nod' decisions. In the experiments, head nod detection accuracy reaches 92% with Microsoft Kinect and 91% with OpenFace on the Joker dataset.
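The abstract describes the pipeline only at a high level (spectral features from 3D head motion, binary SVM classification); the exact features, window lengths, and kernel are not specified there. The following is a minimal, hypothetical sketch of such a pipeline, assuming fixed-length windows of head pitch angles (as might be taken from Kinect or OpenFace output), FFT magnitude spectra as features, an RBF-kernel SVM, and synthetic data purely for illustration.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def spectral_features(pitch_window, n_bins=16):
        """Magnitude spectrum (first n_bins rFFT bins) of one head-pitch window."""
        window = pitch_window - np.mean(pitch_window)          # remove DC offset
        spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
        return spectrum[:n_bins]

    # Synthetic stand-in data: 2-second windows of head pitch angle at 30 fps.
    # A "nod" window oscillates around ~2.5 Hz; a "not nod" window is noise only.
    rng = np.random.default_rng(0)
    t = np.arange(60) / 30.0
    nods = [2.0 * np.sin(2 * np.pi * 2.5 * t) + rng.normal(0, 0.2, t.size) for _ in range(50)]
    stills = [rng.normal(0, 0.2, t.size) for _ in range(50)]
    X = np.array([spectral_features(w) for w in nods + stills])
    y = np.array([1] * 50 + [0] * 50)                          # 1 = head nod, 0 = not head nod

    # Binary SVM classifier on the spectral features.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, y)
    print(clf.predict(spectral_features(nods[0]).reshape(1, -1)))  # expected: [1]

In the actual study the windows would come from tracked 3D head motion rather than synthetic signals, and the classifier parameters would be tuned on the Joker dataset; this sketch only illustrates the feature-extraction-plus-SVM structure named in the abstract.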
dc.description.indexedbyWOS
dc.description.indexedbyScopus
dc.description.openaccessYES
dc.description.publisherscopeInternational
dc.description.sponsoredbyTubitakEuN/A
dc.identifier.doi10.1109/SIU.2019.8806231
dc.identifier.isbn978-1-7281-1904-5
dc.identifier.scopus2-s2.0-85071983203
dc.identifier.urihttps://doi.org/10.1109/SIU.2019.8806231
dc.identifier.urihttps://hdl.handle.net/20.500.14288/7916
dc.identifier.wos518994300001
dc.keywordsBackchannels
dc.keywordsHead nodding
dc.keywordsHuman-Computer interaction
dc.keywordsIntention recognition
dc.keywordsNon-verbal expressions
dc.keywordsSocial signal processing
dc.language.isotur
dc.publisherInstitute of Electrical and Electronics Engineers (IEEE)
dc.relation.ispartof27th Signal Processing and Communications Applications Conference, SIU 2019
dc.subjectEngineering
dc.subjectElectrical and electronic engineering
dc.subjectTelecommunications
dc.titleHead nod detection in dyadic conversations
dc.title.alternativeİkili iletişimde kafa sallama tespiti
dc.typeConference Proceeding
dspace.entity.typePublication
local.contributor.kuauthorNumanoğlu, Tuğçe
local.contributor.kuauthorErzin, Engin
local.contributor.kuauthorYemez, Yücel
local.contributor.kuauthorSezgin, Tevfik Metin
local.publication.orgunit1GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
local.publication.orgunit1College of Engineering
local.publication.orgunit2Department of Computer Engineering
local.publication.orgunit2Graduate School of Sciences and Engineering
relation.isOrgUnitOfPublication89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication3fc31c89-e803-4eb1-af6b-6258bc42c3d8
relation.isOrgUnitOfPublication.latestForDiscovery89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isParentOrgUnitOfPublication8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication434c9663-2b11-4e66-9399-c863e2ebae43
relation.isParentOrgUnitOfPublication.latestForDiscovery8e756b23-2d4a-4ce8-b1b3-62c794a8c164
