Publication:
Multimodal prediction of head nods in dyadic conversations

Language

Turkish

Abstract

Non-verbal expressions in human interactions carry important messages. These messages, which constitute a significant part of the information to be transferred, are not used effectively by machines in human-robot/agent interaction. In this study, the purpose is to predict potential head nod moments for a robot/agent and thereby to develop more human-like interfaces. To achieve this, acoustic feature extraction and social signal annotation are carried out on human-human dyadic conversations. A fixed history window preceding each head nod instance is fed to binary classification. Upon classification by Support Vector Machines, 'potential head nod' or 'no head nod' outputs are obtained. More than half of the head nods are successfully predicted as 'potential head nod', a promising result for human-like robots/agents.
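The pipeline the abstract describes (windowed acoustic features, binary SVM classification) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the synthetic data, window size, and feature dimensions are assumptions, and the real system uses acoustic features and annotations extracted from recorded dyadic conversations.

```python
# Hypothetical sketch of the abstract's pipeline: classify each candidate
# moment as 'potential head nod' vs 'no head nod' from a fixed history
# window of acoustic features. Data here is synthetic, for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for per-frame acoustic features (e.g. pitch, energy) collected
# over a history window preceding each candidate head nod moment.
n_samples, window_frames, n_features = 400, 10, 4
X_frames = rng.normal(size=(n_samples, window_frames, n_features))
y = rng.integers(0, 2, size=n_samples)  # 1 = head nod, 0 = no head nod

# Inject a weak separation so the classifier has something to learn.
X_frames[y == 1] += 0.5

# Flatten each history window into one feature vector per instance.
X = X_frames.reshape(n_samples, -1)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf")  # binary SVM classifier, as in the paper's setup
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.2f}")
```

In the actual study the positive/negative labels come from social signal annotations of human-human conversations rather than synthetic data, and the reported result is that more than half of the true head nods are recovered as 'potential head nod'.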

Source:

26th IEEE Signal Processing and Communications Applications Conference, SIU 2018

Publisher:

Institute of Electrical and Electronics Engineers (IEEE)

Subject

Civil engineering, Electrical electronics engineering, Telecommunication
