Publication:
Affect burst detection using multi-modal cues

dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: N/A
dc.contributor.kuauthor: Sezgin, Tevfik Metin
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.kuauthor: Türker, Bekir Berker
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Marzban, Shabbir
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.kuprofile: Master Student
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.yokid: 18632
dc.contributor.yokid: 107907
dc.contributor.yokid: N/A
dc.contributor.yokid: 34503
dc.contributor.yokid: N/A
dc.date.accessioned: 2024-11-10T00:11:37Z
dc.date.issued: 2015
dc.description.abstract: Affect bursts have recently gained importance in emotion recognition, since they can serve as a prior for recognizing the underlying affect. In this paper we propose a data-driven approach for detecting affect bursts from multimodal input streams such as audio and facial landmark points. The proposed Gaussian Mixture Model (GMM) based method learns each modality independently and then combines the probabilistic outputs to form a decision. This decision-level fusion gives it an edge over feature-fusion methods, since it can handle cases in which one of the modalities is too noisy or unavailable. We demonstrate the robustness of the proposed approach on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) database, which contains realistic and natural dyadic conversations. The database was segmented and labeled for affect bursts by three annotators for training and testing purposes. We also present a performance comparison between SVM-based and GMM-based methods under the same experimental configuration.
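The abstract describes per-modality GMM models whose probabilistic outputs are combined at the decision level. The sketch below illustrates that kind of decision fusion in Python with scikit-learn; the class names, component counts, fusion weights, and threshold are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of decision-level fusion with per-modality GMMs for
# affect-burst detection (illustrative assumptions, not the authors' code).
import numpy as np
from sklearn.mixture import GaussianMixture

class ModalityGMM:
    """One GMM per class (burst = 1, non-burst = 0) for a single modality."""
    def __init__(self, n_components=8):
        self.models = {c: GaussianMixture(n_components=n_components,
                                          covariance_type="diag")
                       for c in (0, 1)}

    def fit(self, X, y):
        # Fit the class-conditional GMMs on frames/segments of one modality.
        for c, gmm in self.models.items():
            gmm.fit(X[y == c])
        return self

    def log_likelihood_ratio(self, X):
        # log p(x | burst) - log p(x | non-burst), one score per sample.
        return self.models[1].score_samples(X) - self.models[0].score_samples(X)

def fuse_and_decide(llr_audio, llr_face, w_audio=0.5, w_face=0.5, threshold=0.0):
    """Weighted combination of per-modality scores; if one modality is
    missing (None), fall back to the other, which is the practical advantage
    of decision fusion over feature fusion."""
    if llr_audio is None:
        fused = llr_face
    elif llr_face is None:
        fused = llr_audio
    else:
        fused = w_audio * llr_audio + w_face * llr_face
    return (fused > threshold).astype(int)

# Example usage (hypothetical feature matrices):
# audio_model = ModalityGMM().fit(X_audio_train, y_train)
# face_model  = ModalityGMM().fit(X_face_train, y_train)
# decisions = fuse_and_decide(audio_model.log_likelihood_ratio(X_audio_test),
#                             face_model.log_likelihood_ratio(X_face_test))
```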
dc.description.indexedby: Scopus
dc.description.indexedby: WoS
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: TÜBİTAK
dc.identifier.doi: 10.1109/SIU.2015.7130002
dc.identifier.isbn: 978-1-4673-7386-9
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84939146695&doi=10.1109%2fSIU.2015.7130002&partnerID=40&md5=3f4d4326fe09b48e6e4f13d445d57388
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-84939146695
dc.identifier.uri: http://dx.doi.org/10.1109/SIU.2015.7130002
dc.identifier.uri: https://hdl.handle.net/20.500.14288/17513
dc.identifier.wos: 380500900232
dc.keywords: Affect burst detection
dc.keywords: Affective computing and interaction
dc.keywords: Applied machine learning data streams
dc.keywords: Gaussian distribution
dc.keywords: Affective computing
dc.keywords: Applied machine learning
dc.keywords: Burst detection
dc.keywords: Data-driven approach
dc.keywords: Gaussian Mixture Model
dc.keywords: Performance comparison
dc.keywords: Probabilistic output
dc.keywords: Training and testing
dc.keywords: Signal processing
dc.language: Turkish
dc.publisher: IEEE
dc.source: 2015 23rd Signal Processing and Communications Applications Conference, SIU 2015 - Proceedings
dc.subject: Computer engineering
dc.title: Affect burst detection using multi-modal cues
dc.title.alternative: Çok-kipli modelleme ile duygusal patlama sezimi
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: 0000-0002-1524-1646
local.contributor.authorid: 0000-0002-7515-3138
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.authorid: N/A
local.contributor.kuauthor: Sezgin, Tevfik Metin
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Türker, Bekir Berker
local.contributor.kuauthor: Erzin, Engin
local.contributor.kuauthor: Marzban, Shabbir
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae

Files