Department of Computer Engineering

Title: Affect burst detection using multi-modal cues
Alternative title (Turkish): Çok-kipli modelleme ile duygusal patlama sezimi
Type: Conference proceeding
Publication year: 2015
Record date: 2024-11-10
ISBN: 978-1-4673-7386-9
DOI: 10.1109/SIU.2015.7130002
Scopus ID: 2-s2.0-84939146695
DOI link: http://dx.doi.org/10.1109/SIU.2015.7130002
Handle: https://hdl.handle.net/20.500.14288/17513
Scopus record: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84939146695&doi=10.1109%2fSIU.2015.7130002&partnerID=40&md5=3f4d4326fe09b48e6e4f13d445d57388
Subject: Computer engineering

Abstract: Affect bursts have recently gained importance in emotion recognition, since they can serve as priors for recognising the underlying affect. In this paper we propose a data-driven approach for detecting affect bursts from multimodal input streams such as audio and facial landmark points. The proposed Gaussian Mixture Model (GMM) based method models each modality independently and then combines the probabilistic outputs to form a decision. This gives it an edge over feature-fusion methods, as it can handle cases in which one of the modalities is too noisy or unavailable. We demonstrate the robustness of the proposed approach on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) database, which contains realistic and natural dyadic conversations. Three annotators segmented and labelled the affect bursts in this database for training and testing purposes. We also present a performance comparison between SVM-based and GMM-based methods under the same experimental configuration.
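The decision-level fusion the abstract describes (one GMM per modality, probabilistic outputs combined, with the ability to drop a noisy or missing modality) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the modality names, feature dimensions, and toy data are all assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Sketch of GMM-based late fusion: fit one GMM per (modality, class),
# score a sample by summing per-modality log-likelihoods over whichever
# modalities are available, then pick the higher-scoring class.
# Modalities, dimensions, and data here are hypothetical.

rng = np.random.default_rng(0)

def make_data(mean, dim, n=200):
    """Toy training features drawn around a class-specific mean."""
    return rng.normal(mean, 1.0, size=(n, dim))

# Two classes: 0 = no affect burst, 1 = affect burst.
train = {
    "audio": {0: make_data(0.0, 4), 1: make_data(2.0, 4)},
    "face":  {0: make_data(0.0, 6), 1: make_data(2.0, 6)},
}

# One GMM per modality and class, trained independently.
models = {
    mod: {cls: GaussianMixture(n_components=2, random_state=0).fit(X)
          for cls, X in classes.items()}
    for mod, classes in train.items()
}

def classify(sample, available):
    """Sum log-likelihoods across the modalities that are available."""
    scores = {cls: 0.0 for cls in (0, 1)}
    for mod in available:
        x = sample[mod].reshape(1, -1)
        for cls in scores:
            scores[cls] += models[mod][cls].score(x)
    return max(scores, key=scores.get)

# A sample drawn near the class-1 means; dropping "face" mimics a
# missing or unusable modality, which late fusion tolerates.
sample = {"audio": rng.normal(2.0, 1.0, 4), "face": rng.normal(2.0, 1.0, 6)}
print(classify(sample, ["audio", "face"]))
print(classify(sample, ["audio"]))
```

Because the scores are combined only at decision level, skipping a modality just shortens the sum, which is the advantage over feature-level fusion noted in the abstract.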