Publication:
Head nod detection in dyadic conversations

dc.contributor.coauthorN/A
dc.contributor.departmentN/A
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.departmentDepartment of Computer Engineering
dc.contributor.kuauthorNumanoğlu, Tuğçe
dc.contributor.kuauthorErzin, Engin
dc.contributor.kuauthorYemez, Yücel
dc.contributor.kuauthorSezgin, Tevfik Metin
dc.contributor.kuprofileMaster Student
dc.contributor.kuprofileFaculty Member
dc.contributor.kuprofileFaculty Member
dc.contributor.kuprofileFaculty Member
dc.contributor.otherDepartment of Computer Engineering
dc.contributor.schoolcollegeinstituteGraduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.schoolcollegeinstituteCollege of Engineering
dc.contributor.yokidN/A
dc.contributor.yokid34503
dc.contributor.yokid107907
dc.contributor.yokid18632
dc.date.accessioned2024-11-09T22:59:34Z
dc.date.issued2019
dc.description.abstractIn face-to-face interactions, head gestures serve as important back-channel signals. Among them, head nods let listeners convey approval or interest as feedback in dyadic conversations. Detecting head nods is therefore expected to improve the understanding of such feedback and, in turn, human-computer interaction. This study aims to detect head nods in order to make human-computer interaction more human-like. In the proposed pipeline, a 3D head model is obtained with Microsoft Kinect and the OpenFace application. Binary classification is then performed with a Support Vector Machine (SVM) classifier on spectral features extracted from the 3D head motion, yielding 'head nod' or 'not head nod' decisions. In experiments on the Joker dataset, head nod detection accuracy reaches 92% with Microsoft Kinect and 91% with OpenFace.
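dc.description.noteThe abstract describes a spectral-feature SVM pipeline; the sketch below illustrates that general approach. It is a minimal, hypothetical example (window length, feature choice, synthetic data, and SVM settings are assumptions, not the paper's configuration), assuming windowed head rotation angles are already available from a tracker such as Kinect or OpenFace.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def spectral_features(window: np.ndarray) -> np.ndarray:
    """Magnitude spectrum of each rotation channel in a head-motion window.

    window: (n_frames, 3) array of head pitch/yaw/roll angles.
    Returns a flat feature vector of per-channel FFT magnitudes.
    """
    spectrum = np.abs(np.fft.rfft(window, axis=0))
    return spectrum.ravel()

# Hypothetical dataset: 200 windows of 30 frames x 3 rotation angles,
# with binary labels (1 = head nod, 0 = not head nod).
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 30, 3))
labels = rng.integers(0, 2, size=200)

X = np.stack([spectral_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf")  # binary SVM classifier on spectral features
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))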
dc.description.indexedbyWoS
dc.description.indexedbyScopus
dc.description.openaccessYES
dc.description.publisherscopeInternational
dc.identifier.doi10.1109/SIU.2019.8806231
dc.identifier.isbn978-1-7281-1904-5
dc.identifier.linkhttps://www.scopus.com/inward/record.uri?eid=2-s2.0-85071983203&doi=10.1109%2fSIU.2019.8806231&partnerID=40&md5=adc87de6592e5f88453de00678025b88
dc.identifier.scopus2-s2.0-85071983203
dc.identifier.urihttp://dx.doi.org/10.1109/SIU.2019.8806231
dc.identifier.urihttps://hdl.handle.net/20.500.14288/7916
dc.identifier.wos518994300001
dc.keywordsBackchannels
dc.keywordsHead nodding
dc.keywordsHuman-Computer interaction
dc.keywordsIntention recognition
dc.keywordsNon-verbal expressions
dc.keywordsSocial signal processing
dc.languageTurkish
dc.publisherInstitute of Electrical and Electronics Engineers (IEEE)
dc.source27th Signal Processing and Communications Applications Conference, SIU 2019
dc.subjectEngineering
dc.subjectElectrical and electronic engineering
dc.subjectTelecommunications
dc.titleHead nod detection in dyadic conversations
dc.title.alternativeİkili iletişimde kafa sallama tespiti
dc.typeConference proceeding
dspace.entity.typePublication
local.contributor.authoridN/A
local.contributor.authorid0000-0002-2715-2368
local.contributor.authorid0000-0002-7515-3138
local.contributor.authorid0000-0002-1524-1646
local.contributor.kuauthorNumanoğlu, Tuğçe
local.contributor.kuauthorErzin, Engin
local.contributor.kuauthorYemez, Yücel
local.contributor.kuauthorSezgin, Tevfik Metin
relation.isOrgUnitOfPublication89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery89352e43-bf09-4ef4-82f6-6f9d0174ebae