Publication:
Multimodal prediction of head nods in dyadic conversations

Alternative Title

İkili iletişimde olası kafa sallama anlarının çok kipli kestirimi

Abstract

Non-verbal expressions in human interactions carry important messages. Although these messages constitute a significant part of the information exchanged, machines do not yet use them effectively in human-robot/agent interaction. The aim of this study is to predict potential head-nod moments for a robot/agent and thereby develop more human-like interfaces. To this end, acoustic feature extraction and social-signal annotation are carried out on human-human dyadic conversations. A fixed history window preceding each head-nod instance is fed to a binary classifier. Classification with Support Vector Machines then labels each moment as 'potential head nod' or 'no head nod'. More than half of the head nods are successfully predicted as 'potential head nod', a promising result for human-like robots/agents.
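The pipeline described in the abstract (a history window of acoustic features classified by an SVM into 'potential head nod' / 'no head nod') can be sketched as follows. This is a minimal illustration with synthetic data; the window length, feature count, and label rule are assumptions for demonstration, not the authors' actual setup.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for frame-level acoustic features (e.g. pitch, energy).
# Each example flattens a history window of W frames with F features per
# frame; these sizes are illustrative assumptions.
W, F = 10, 4
n = 200
windows = rng.normal(size=(n, W * F))

# Toy binary labels standing in for annotated head-nod moments
# (1 = head nod occurred after this window, 0 = no head nod).
labels = (windows[:, :F].mean(axis=1) > 0).astype(int)

# Binary SVM classifier: output 1 = 'potential head nod', 0 = 'no head nod'.
clf = SVC(kernel="rbf")
clf.fit(windows[:150], labels[:150])
preds = clf.predict(windows[150:])

# Fraction of true head nods caught, mirroring the paper's
# "more than half predicted" criterion.
recall = (preds[labels[150:] == 1] == 1).mean()
```

In practice the features would come from the recorded conversations and the labels from the social-signal annotations, but the fit/predict structure stays the same.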

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Subject

Civil engineering, Electrical electronics engineering, Telecommunication

Source

26th IEEE Signal Processing and Communications Applications Conference, SIU 2018

DOI

10.1109/SIU.2018.8404737
