Publication:
Affect burst recognition using multi-modal cues

dc.contributor.department: N/A
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Türker, Bekir Berker
dc.contributor.kuauthor: Marzban, Shabbir
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.kuauthor: Sezgin, Tevfik Metin
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Master Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: N/A
dc.contributor.yokid: 34503
dc.contributor.yokid: 107907
dc.contributor.yokid: 18632
dc.date.accessioned: 2024-11-09T23:46:59Z
dc.date.issued: 2014
dc.description.abstract: Affect bursts, which are nonverbal expressions of emotion in conversation, play a critical role in analyzing affective states. Although a number of methods exist for affect burst detection and recognition using audio information alone, little effort has been spent on combining cues in a multi-modal setup. We suggest that facial gestures constitute a key component in characterizing affect bursts, and hence have the potential to make affect burst detection and recognition more robust. We take a data-driven approach, characterizing affect bursts with Hidden Markov Models (HMMs) and employing a multi-modal decision fusion scheme that combines cues from audio and facial gestures to classify affect bursts. We demonstrate the contribution of facial gestures to affect burst recognition through experiments on an audiovisual database comprising speech and facial motion data from various dyadic conversations.
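
The HMM modeling and decision fusion described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python illustration, not the authors' implementation: it assumes the hmmlearn library, illustrative burst categories (laughter, breath, other), per-segment feature matrices for each modality, and a fusion weight alpha, none of which are specified in this record.

import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed dependency: pip install hmmlearn

CLASSES = ["laughter", "breath", "other"]  # hypothetical affect burst categories

def train_hmms(segments_by_class, n_states=5):
    # Fit one Gaussian HMM per affect burst class from labeled segments.
    # segments_by_class: {label: list of (frames x feature_dim) arrays}
    models = {}
    for label, segments in segments_by_class.items():
        X = np.vstack(segments)               # stack all segments frame-wise
        lengths = [len(s) for s in segments]  # frame count of each segment
        models[label] = GaussianHMM(n_components=n_states).fit(X, lengths)
    return models

def classify_fused(audio_models, visual_models, audio_seg, visual_seg, alpha=0.6):
    # Late (decision-level) fusion: weighted sum of per-class log-likelihoods
    # from the audio and facial-gesture HMMs; alpha is an assumed weight.
    scores = {}
    for label in CLASSES:
        ll_audio = audio_models[label].score(audio_seg)     # audio log-likelihood
        ll_visual = visual_models[label].score(visual_seg)  # facial log-likelihood
        scores[label] = alpha * ll_audio + (1.0 - alpha) * ll_visual
    return max(scores, key=scores.get)

In a decision-fusion scheme of this kind, each modality keeps its own per-class HMMs and only their scores are combined, so either modality can compensate when the other's cues are weak.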
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.identifier.doi: 10.1109/SIU.2014.6830552
dc.identifier.isbn: 978-1-4799-4874-1
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84903753005&doi=10.1109%2fSIU.2014.6830552&partnerID=40&md5=6768ae3171193421a7bf5dc6ea34a0f0
dc.identifier.scopus: 2-s2.0-84903753005
dc.identifier.uri: http://dx.doi.org/10.1109/SIU.2014.6830552
dc.identifier.uri: https://hdl.handle.net/20.500.14288/14052
dc.identifier.wos: 356351400381
dc.keywords: Affect burst
dc.keywords: Multimodal recognition
dc.language: Turkish
dc.publisher: IEEE Computer Society
dc.source: 2014 22nd Signal Processing and Communications Applications Conference, SIU 2014 - Proceedings
dc.subject: Civil engineering
dc.subject: Electrical electronics engineering
dc.subject: Telecommunication
dc.title: Affect burst recognition using multi-modal cues
dc.title.alternative: Çok-kipli ipuçları kullanarak duygusal patlama tanıma
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.authorid: 0000-0002-7515-3138
local.contributor.authorid: 0000-0002-1524-1646
local.contributor.kuauthor: Türker, Bekir Berker
local.contributor.kuauthor: Marzban, Shabbir
local.contributor.kuauthor: Erzin, Engin
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Sezgin, Tevfik Metin
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae