Publication:
Estimation of personalized facial gesture patterns

dc.contributor.department: Department of Electrical and Electronics Engineering
dc.contributor.department: Department of Computer Engineering
dc.contributor.department: Graduate School of Sciences and Engineering
dc.contributor.kuauthor: Erzin, Engin
dc.contributor.kuauthor: Ofli, Ferda
dc.contributor.kuauthor: Tekalp, Ahmet Murat
dc.contributor.kuauthor: Yemez, Yücel
dc.contributor.schoolcollegeinstitute: College of Engineering
dc.contributor.schoolcollegeinstitute: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
dc.date.accessioned: 2024-11-09T22:59:00Z
dc.date.issued: 2007
dc.description.abstract: We propose a framework for the estimation and analysis of temporal facial expression patterns of a speaker. The goal of this framework is to learn the personalized elementary dynamic facial expression patterns for a particular speaker. We track the lips, eyebrows, and eyelids of the speaker in 3D across a head-and-shoulder stereo video sequence. We use MPEG-4 Facial Definition Parameters (FDPs) to create the feature set, and MPEG-4 Facial Animation Parameters (FAPs) to represent the temporal facial expression patterns. Hidden Markov Model (HMM) based unsupervised temporal segmentation is performed separately on the upper and lower facial expression features to determine recurrent elementary facial expression patterns for the particular speaker. These facial expression patterns, which are coded by FAP sequences and need not be tied to prespecified emotions, can be used for personalized emotion estimation and synthesis for a speaker. Experimental results are presented.
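The abstract's core step, HMM-based unsupervised temporal segmentation of facial feature trajectories into recurrent elementary patterns, can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a hypothetical frame-wise feature matrix standing in for MPEG-4 FAP trajectories, the hmmlearn library's GaussianHMM, and an arbitrary choice of five pattern states.

# Minimal sketch (illustrative, not the paper's code): unsupervised temporal
# segmentation of facial-expression feature trajectories with a Gaussian HMM,
# assuming frame-wise feature vectors (e.g., FAP-like values) are already extracted.
# Requires: numpy, hmmlearn.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

# Stand-in for tracked upper-face features (e.g., eyebrow/eyelid trajectories),
# shape (n_frames, n_features); real data would come from the 3D tracker.
n_frames, n_features = 600, 8
features = rng.normal(size=(n_frames, n_features))

# One hidden state per hypothesized elementary expression pattern
# (five states is an illustrative choice, not a value from the paper).
n_patterns = 5
model = hmm.GaussianHMM(n_components=n_patterns,
                        covariance_type="diag",
                        n_iter=100,
                        random_state=0)
model.fit(features)                      # unsupervised EM training

# Viterbi decoding assigns each frame to a recurrent pattern,
# yielding a temporal segmentation of the sequence.
state_sequence = model.predict(features)

# Collapse consecutive identical states into (pattern_id, start, end) segments.
segments = []
start = 0
for t in range(1, n_frames + 1):
    if t == n_frames or state_sequence[t] != state_sequence[start]:
        segments.append((int(state_sequence[start]), start, t))
        start = t
print(segments[:10])

In this sketch the decoded state sequence plays the role of the recurrent FAP-coded patterns; upper- and lower-face features would be segmented separately, as described in the abstract.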
dc.description.indexedby: Scopus
dc.description.indexedby: WOS
dc.description.openaccess: YES
dc.description.publisherscope: International
dc.description.sponsoredbyTubitakEu: N/A
dc.identifier.doi: 10.1109/SIU.2007.4298615
dc.identifier.isbn: 1-4244-0719-2
dc.identifier.isbn: 978-1-4244-0719-4
dc.identifier.quartile: N/A
dc.identifier.scopus: 2-s2.0-50249149810
dc.identifier.uri: https://doi.org/10.1109/SIU.2007.4298615
dc.identifier.uri: https://hdl.handle.net/20.500.14288/7819
dc.identifier.wos: 252924600149
dc.keywords: Animation
dc.keywords: Estimation
dc.keywords: Feature extraction
dc.keywords: Hidden Markov models
dc.keywords: Imaging techniques
dc.keywords: Markov processes
dc.keywords: Motion Picture Experts Group standards
dc.keywords: Parameter estimation
dc.keywords: Photography
dc.keywords: Signal processing
dc.keywords: Three dimensional
dc.keywords: Video recording
dc.keywords: Dynamic facial expression
dc.keywords: Emotion estimation
dc.keywords: Facial Definition Parameters
dc.keywords: Facial expressions
dc.keywords: Feature sets
dc.keywords: Hidden-Markov model
dc.keywords: MPEG4 facial animation
dc.keywords: Stereo video
dc.keywords: Temporal segmentations
dc.keywords: Face recognition
dc.language.iso: tur
dc.publisher: IEEE
dc.relation.ispartof: 2007 IEEE 15th Signal Processing and Communications Applications, SIU
dc.subject: Electrical electronics engineering
dc.subject: Computer engineering
dc.title: Estimation of personalized facial gesture patterns
dc.title.alternative: Kişiselleştirilmiş yüz jest örüntülerinin kestirimi
dc.type: Conference Proceeding
dspace.entity.type: Publication
local.contributor.kuauthor: Tekalp, Ahmet Murat
local.contributor.kuauthor: Erzin, Engin
local.contributor.kuauthor: Yemez, Yücel
local.contributor.kuauthor: Ofli, Ferda
local.publication.orgunit1: College of Engineering
local.publication.orgunit1: GRADUATE SCHOOL OF SCIENCES AND ENGINEERING
local.publication.orgunit2: Department of Electrical and Electronics Engineering
local.publication.orgunit2: Department of Computer Engineering
local.publication.orgunit2: Graduate School of Sciences and Engineering
relation.isOrgUnitOfPublication: 21598063-a7c5-420d-91ba-0cc9b2db0ea0
relation.isOrgUnitOfPublication: 89352e43-bf09-4ef4-82f6-6f9d0174ebae
relation.isOrgUnitOfPublication: 3fc31c89-e803-4eb1-af6b-6258bc42c3d8
relation.isOrgUnitOfPublication.latestForDiscovery: 21598063-a7c5-420d-91ba-0cc9b2db0ea0
relation.isParentOrgUnitOfPublication: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164
relation.isParentOrgUnitOfPublication: 434c9663-2b11-4e66-9399-c863e2ebae43
relation.isParentOrgUnitOfPublication.latestForDiscovery: 8e756b23-2d4a-4ce8-b1b3-62c794a8c164