Publication:
Affect recognition from lip articulations

dc.contributor.department: N/A
dc.contributor.department: Department of Computer Engineering
dc.contributor.kuauthor: Sadiq, Rizwan
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuprofile: PhD Student
dc.contributor.kuprofile: Faculty Member
dc.contributor.other: Department of Computer Engineering
dc.contributor.schoolcollegeinstitute: Graduate School of Sciences and Engineering
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.yokid: N/A
dc.contributor.yokid: 34503
dc.date.accessioned: 2024-11-09T23:15:06Z
dc.date.issued: 2017
dc.description.abstract: Lips deliver visually active clues for speech articulation. Affective states define how humans articulate speech; hence, they also change the articulation of lip motion. In this paper, we investigate the effect of phonetic classes on affect recognition from lip articulations. The affect recognition problem is formalized over discrete activation, valence and dominance attributes. We use the symmetric Kullback-Leibler divergence (KLD) to rate phonetic classes with larger discrimination across different affective states. We perform experimental evaluations using the IEMOCAP database. Our results demonstrate that lip articulations over a set of discriminative phonetic classes improve the affect recognition performance, and attain 3-class recognition rates of 72.16%, 46.44% and 64.92% for the activation, valence and dominance (AVD) attributes, respectively.
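The symmetric KLD scoring described in the abstract can be sketched as follows. This is a minimal illustration assuming Gaussian models of per-phonetic-class lip features; the function names, feature dimension and sample data are hypothetical and not taken from the paper.

# Sketch: rank phonetic classes by symmetric Kullback-Leibler divergence
# between affective states. Gaussian modeling of lip features is an
# assumption made for this illustration, not necessarily the paper's setup.
import numpy as np

def gaussian_kld(mu0, cov0, mu1, cov1):
    """KL divergence D(N0 || N1) between two multivariate Gaussians."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)
        + diff @ cov1_inv @ diff
        - k
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )

def symmetric_kld(feats_a, feats_b):
    """Symmetric KLD between lip-feature sets of one phonetic class
    observed under two affective states (rows = frames, cols = features)."""
    mu_a, cov_a = feats_a.mean(axis=0), np.cov(feats_a, rowvar=False)
    mu_b, cov_b = feats_b.mean(axis=0), np.cov(feats_b, rowvar=False)
    return gaussian_kld(mu_a, cov_a, mu_b, cov_b) + gaussian_kld(mu_b, cov_b, mu_a, cov_a)

# Hypothetical usage with synthetic data: a higher symmetric KLD means the
# phonetic class discriminates better between, e.g., low and high activation.
rng = np.random.default_rng(0)
lip_feats = {  # phonetic class -> {affect label -> frames x features}
    "vowel": {"low": rng.normal(0.0, 1.0, (200, 4)), "high": rng.normal(0.8, 1.2, (200, 4))},
    "nasal": {"low": rng.normal(0.0, 1.0, (200, 4)), "high": rng.normal(0.1, 1.0, (200, 4))},
}
scores = {c: symmetric_kld(d["low"], d["high"]) for c, d in lip_feats.items()}
for cls, score in sorted(scores.items(), key=lambda x: -x[1]):
    print(f"{cls}: symmetric KLD = {score:.3f}")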
dc.description.indexedby: WoS
dc.description.indexedby: Scopus
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsorship: The Institute of Electrical and Electronics Engineers Signal Processing Society
dc.identifier.doi: 10.1109/ICASSP.2017.7952593
dc.identifier.isbn: 978-1-5090-4117-6
dc.identifier.issn: 1520-6149
dc.identifier.link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85023774904&doi=10.1109%2fICASSP.2017.7952593&partnerID=40&md5=a181e952a1e72f4f0ed84942266388a0
dc.identifier.scopus: 2-s2.0-85023774904
dc.identifier.uri: http://dx.doi.org/10.1109/ICASSP.2017.7952593
dc.identifier.uri: https://hdl.handle.net/20.500.14288/10268
dc.identifier.wos: 414286202122
dc.keywords: Affect recognition
dc.keywords: Emotion recognition
dc.keywords: Kullback-Leibler divergence
dc.keywords: Lip articulations
dc.keywords: Phoneme
dc.language: English
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.source: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
dc.subject: Acoustics
dc.subject: Electrical and electronic engineering
dc.title: Affect recognition from lip articulations
dc.type: Conference proceeding
dspace.entity.type: Publication
local.contributor.authorid: N/A
local.contributor.authorid: 0000-0002-2715-2368
local.contributor.kuauthor: Sadiq, Rizwan
local.contributor.kuauthor: Erzin, Engin
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication.latestForDiscovery: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
