Researcher: Numanoğlu, Tuğçe
Name Variants
Numanoğlu, Tuğçe
3 results
Search Results
Now showing 1 - 3 of 3
Publication (metadata only): Realtime engagement measurement in human-computer interaction (IEEE, 2020)
Authors: Kesim, Ege (Master Student); Numanoğlu, Tuğçe (Master Student); Türker, Bekir Berker (PhD Student); Erzin, Engin (Faculty Member); Yemez, Yücel (Faculty Member); Sezgin, Tevfik Metin (Faculty Member)
Affiliation: Department of Computer Engineering, Koç University (Graduate School of Sciences and Engineering; College of Engineering)
Abstract: Social robots are expected to understand their interlocutors and behave accordingly, as humans do. Endowing robots with the capability of monitoring user engagement during their interactions with humans is one of the crucial steps towards achieving this goal. In this work, an interactive game that is played with a robot is designed and implemented. During the interaction, user engagement is monitored in real time by detecting user gaze, turn-taking, laughter/smiles and head nods from audio-visual data. In the experiments conducted, the engagement monitored in real time is found to be consistent with the human-annotated engagement levels.
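The abstract above describes estimating engagement in real time by fusing detections of gaze, turn-taking, laughter/smiles and head nods. A minimal Python sketch of that kind of late fusion over a sliding window follows; the class name, cue weights and window length are illustrative assumptions, not the authors' implementation.

```python
from collections import deque

class EngagementMonitor:
    """Illustrative real-time engagement estimator (not the paper's code).

    Fuses per-frame multimodal cues -- gaze on the robot, smiling,
    head nodding and turn-taking -- into a smoothed score in [0, 1].
    """

    # Cue weights are arbitrary assumptions for this sketch.
    WEIGHTS = {"gaze": 0.4, "smile": 0.2, "nod": 0.2, "turn": 0.2}

    def __init__(self, window_frames=150):  # e.g. 5 s of video at 30 fps
        self.history = deque(maxlen=window_frames)

    def update(self, cues):
        """cues: dict mapping cue name to a detector output in [0, 1]."""
        frame_score = sum(w * cues.get(name, 0.0)
                          for name, w in self.WEIGHTS.items())
        self.history.append(frame_score)
        # Smoothed engagement = mean frame score over the sliding window.
        return sum(self.history) / len(self.history)

monitor = EngagementMonitor()
# Hypothetical detector outputs for one video frame:
score = monitor.update({"gaze": 1.0, "smile": 0.0, "nod": 1.0, "turn": 0.3})
print(f"engagement: {score:.2f}")
```

In the paper, the real-time estimate is validated against human-annotated engagement levels; in practice the weights here would have to be tuned or learned from such annotations.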
Publication (metadata only): The eHRI database: a multimodal database of engagement in human-robot interactions (Springer, 2023)
Authors: Kesim, Ege (Master Student); Numanoğlu, Tuğçe (Master Student); Bayramoğlu, Öykü Zeynep (Master Student); Türker, Bekir Berker (Researcher); Hussain, Nusrah (PhD Student); Sezgin, Tevfik Metin (Faculty Member); Yemez, Yücel (Faculty Member); Erzin, Engin (Faculty Member)
Affiliation: Department of Computer Engineering, Koç University; Koç Üniversitesi İş Bankası Yapay Zeka Uygulama ve Araştırma Merkezi (KUIS AI) / Koç University İş Bank Artificial Intelligence Center (KUIS AI)
Abstract: We present the engagement in human-robot interaction (eHRI) database, containing natural interactions between two human participants and a robot under a story-shaping game scenario. The audio-visual recordings provided with the database are fully annotated on a 5-intensity scale for head nods and smiles, as well as with speech transcriptions and continuous engagement values. In addition, we present baseline results for smile and head nod detection, along with a real-time multimodal engagement monitoring system. We believe that the eHRI database will serve as a novel asset for research in affective human-robot interaction by providing raw data, annotations, and baseline results.

Publication (metadata only): Head nod detection in dyadic conversations (IEEE, 2019)
Authors: Numanoğlu, Tuğçe (Master Student); Erzin, Engin (Faculty Member); Yemez, Yücel (Faculty Member); Sezgin, Tevfik Metin (Faculty Member)
Affiliation: Department of Computer Engineering, Koç University (Graduate School of Sciences and Engineering; College of Engineering)
Abstract: In face-to-face interactions, head gestures play an important role as back-channel signals. Among them, head nods can display the approval or interest of listeners as feedback in dyadic conversations. Hence, detection of head nods is expected to improve understanding of the given feedback and, in turn, human-computer interaction. This study targets the detection of head nods with the aim of making human-computer interaction more human-like. In the process, a 3D head model is obtained with the Microsoft Kinect and the OpenFace application. Binary classification is performed on spectral features, extracted from 3D head motion, with a Support Vector Machine (SVM) classifier, yielding 'head nod' or 'not head nod' outputs. In the experimental studies, head nod detection accuracy is 92% with Microsoft Kinect and 91% with OpenFace on the Joker dataset.
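The 2019 abstract outlines a concrete pipeline: track 3D head motion, extract spectral features from it, and classify each window as 'head nod' or 'not head nod' with an SVM. The sketch below illustrates that idea on synthetic head-pitch trajectories using scikit-learn; the window length, number of frequency bins, nod frequency and RBF kernel are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.svm import SVC

def spectral_features(pitch_angles, n_bins=16):
    """Magnitude spectrum of a head-pitch trajectory (illustrative).

    pitch_angles: per-frame head pitch (degrees) from a tracker such
    as the Microsoft Kinect or OpenFace.
    """
    window = np.asarray(pitch_angles, dtype=float)
    window = window - window.mean()         # remove static head pose
    spectrum = np.abs(np.fft.rfft(window))  # energy per frequency bin
    return spectrum[:n_bins]

# Synthetic training data: nods as ~2.5 Hz pitch oscillations plus noise.
rng = np.random.default_rng(0)
t = np.arange(45) / 30.0                    # 1.5 s windows at 30 fps
nods = [3 * np.sin(2 * np.pi * 2.5 * t) + rng.normal(0, 0.3, t.size)
        for _ in range(50)]
stills = [rng.normal(0, 0.3, t.size) for _ in range(50)]

X = np.array([spectral_features(w) for w in nods + stills])
y = np.array([1] * 50 + [0] * 50)           # 1 = head nod, 0 = not head nod

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(spectral_features(nods[0]).reshape(1, -1)))  # -> [1]
```

The spectral representation works here because nodding shows up as periodic energy in a narrow frequency band of the pitch signal, which a binary SVM can separate from non-periodic head motion.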