Publication: Estimation of personalized facial gesture patterns
dc.contributor.department | Department of Electrical and Electronics Engineering | |
dc.contributor.department | Department of Computer Engineering | |
dc.contributor.department | Graduate School of Sciences and Engineering | |
dc.contributor.kuauthor | Erzin, Engin | |
dc.contributor.kuauthor | Ofli, Ferda | |
dc.contributor.kuauthor | Tekalp, Ahmet Murat | |
dc.contributor.kuauthor | Yemez, Yücel | |
dc.contributor.schoolcollegeinstitute | College of Engineering | |
dc.contributor.schoolcollegeinstitute | GRADUATE SCHOOL OF SCIENCES AND ENGINEERING | |
dc.date.accessioned | 2024-11-09T22:59:00Z | |
dc.date.issued | 2007 | |
dc.description.abstract | We propose a framework for estimation and analysis of temporal facial expression patterns of a speaker. The goal of this framework is to learn the personalized elementary dynamic facial expression patterns for a particular speaker. We track the lip, eyebrow, and eyelid of the speaker in 3D across a head-and-shoulder stereo video sequence. We use MPEG-4 Facial Definition Parameters (FDPs) to create the feature set, and MPEG-4 Facial Animation Parameters (FAPs) to represent the temporal facial expression patterns. Hidden Markov Model (HMM) based unsupervised temporal segmentation of upper and lower facial expression features is performed separately to determine recurrent elementary facial expression patterns for the particular speaker. These facial expression patterns, which are coded by FAP sequences and need not be tied to prespecified emotions, can be used for personalized emotion estimation and synthesis of a speaker. Experimental results are presented. | |
dc.description.indexedby | Scopus | |
dc.description.indexedby | WOS | |
dc.description.openaccess | YES | |
dc.description.publisherscope | International | |
dc.description.sponsoredbyTubitakEu | N/A | |
dc.identifier.doi | 10.1109/SIU.2007.4298615 | |
dc.identifier.isbn | 1-4244-0719-2 | |
dc.identifier.isbn | 978-1-4244-0719-4 | |
dc.identifier.quartile | N/A | |
dc.identifier.scopus | 2-s2.0-50249149810 | |
dc.identifier.uri | https://doi.org/10.1109/SIU.2007.4298615 | |
dc.identifier.uri | https://hdl.handle.net/20.500.14288/7819 | |
dc.identifier.wos | 252924600149 | |
dc.keywords | Animation | |
dc.keywords | Estimation | |
dc.keywords | Feature extraction | |
dc.keywords | Hidden Markov models | |
dc.keywords | Imaging techniques | |
dc.keywords | Markov processes | |
dc.keywords | Motion Picture Experts Group standards | |
dc.keywords | Parameter estimation | |
dc.keywords | Photography | |
dc.keywords | Signal processing | |
dc.keywords | Three dimensional | |
dc.keywords | Video recording | |
dc.keywords | Dynamic facial expression | |
dc.keywords | Emotion estimation | |
dc.keywords | Facial Definition Parameters | |
dc.keywords | Facial expressions | |
dc.keywords | Feature sets | |
dc.keywords | Hidden-Markov model | |
dc.keywords | MPEG4 facial animation | |
dc.keywords | Stereo video | |
dc.keywords | Temporal segmentations | |
dc.keywords | Face recognition | |
dc.language.iso | tur | |
dc.publisher | IEEE | |
dc.relation.ispartof | 2007 IEEE 15th Signal Processing and Communications Applications, SIU | |
dc.subject | Electrical electronics engineering | |
dc.subject | Computer engineering | |
dc.title | Estimation of personalized facial gesture patterns | |
dc.title.alternative | Kişiselleştirilmiş yüz jest örüntülerinin kestirimi | |
dc.type | Conference Proceeding | |
dspace.entity.type | Publication | |
local.contributor.kuauthor | Tekalp, Ahmet Murat | |
local.contributor.kuauthor | Erzin, Engin | |
local.contributor.kuauthor | Yemez, Yücel | |
local.contributor.kuauthor | Ofli, Ferda | |
local.publication.orgunit1 | College of Engineering | |
local.publication.orgunit1 | GRADUATE SCHOOL OF SCIENCES AND ENGINEERING | |
local.publication.orgunit2 | Department of Electrical and Electronics Engineering | |
local.publication.orgunit2 | Department of Computer Engineering | |
local.publication.orgunit2 | Graduate School of Sciences and Engineering | |
relation.isOrgUnitOfPublication | 21598063-a7c5-420d-91ba-0cc9b2db0ea0 | |
relation.isOrgUnitOfPublication | 89352e43-bf09-4ef4-82f6-6f9d0174ebae | |
relation.isOrgUnitOfPublication | 3fc31c89-e803-4eb1-af6b-6258bc42c3d8 | |
relation.isOrgUnitOfPublication.latestForDiscovery | 21598063-a7c5-420d-91ba-0cc9b2db0ea0 | |
relation.isParentOrgUnitOfPublication | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 | |
relation.isParentOrgUnitOfPublication | 434c9663-2b11-4e66-9399-c863e2ebae43 | |
relation.isParentOrgUnitOfPublication.latestForDiscovery | 8e756b23-2d4a-4ce8-b1b3-62c794a8c164 |