Open Science Index
Facial Expressions Animation and Lip Tracking Using Facial Characteristic Points and Deformable Model
Abstract:
The face and facial expressions play essential roles in interpersonal communication. Most current work on facial expression recognition attempts to recognize a small set of prototypic expressions: happiness, surprise, anger, sadness, disgust, and fear. However, most human emotions are communicated by changes in only one or two discrete facial features. In this paper, we develop a facial expression synthesis system based on tracking facial characteristic points (FCPs) in frontal image sequences. Selected FCPs are tracked automatically using cross-correlation-based optical flow. The proposed synthesis system uses a simple deformable facial-features model with a small set of control points that can be tracked in the original facial image sequences.
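The cross-correlation-based tracking described above can be illustrated with a minimal sketch: for each FCP, a small template patch around the point in the previous frame is matched against candidate patches inside a search window in the next frame, and the displacement with the highest normalized cross-correlation (NCC) score is taken as the point's motion. This is an assumed, simplified implementation for illustration, not the authors' code; the function name `track_fcp` and the window sizes are hypothetical.

```python
import numpy as np

def track_fcp(prev_frame, next_frame, point, tmpl_half=4, search_half=8):
    """Track one facial characteristic point (FCP) from prev_frame to
    next_frame by normalized cross-correlation template matching.
    `point` is (row, col) in prev_frame; returns the best-matching
    (row, col) in next_frame within the search window.
    (Illustrative sketch only; parameters are assumed, not from the paper.)"""
    r, c = point
    # Template: patch centered on the FCP in the previous frame.
    tmpl = prev_frame[r - tmpl_half:r + tmpl_half + 1,
                      c - tmpl_half:c + tmpl_half + 1].astype(float)
    tmpl = tmpl - tmpl.mean()
    best_score, best_pos = -np.inf, point
    # Exhaustive search over displacements inside the search window.
    for dr in range(-search_half, search_half + 1):
        for dc in range(-search_half, search_half + 1):
            rr, cc = r + dr, c + dc
            patch = next_frame[rr - tmpl_half:rr + tmpl_half + 1,
                               cc - tmpl_half:cc + tmpl_half + 1].astype(float)
            if patch.shape != tmpl.shape:
                continue  # candidate window fell outside the image
            patch = patch - patch.mean()
            denom = np.sqrt((tmpl ** 2).sum() * (patch ** 2).sum())
            if denom == 0:
                continue  # flat (zero-variance) patch, NCC undefined
            score = (tmpl * patch).sum() / denom  # NCC score in [-1, 1]
            if score > best_score:
                best_score, best_pos = score, (rr, cc)
    return best_pos
```

For example, if a textured patch centered at (20, 20) in one frame moves by (+3, +1) pixels, `track_fcp` returns (23, 21). In practice such per-point matching is applied to every selected FCP per frame pair, and the resulting displacements drive the control points of the deformable model.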
References:

[1] P. Ekman and W. V. Friesen, Facial Action Coding System (FACS), Consulting Psychologists Press, 1978.
[2] Z. Wu, P. S. Aleksic, and A. K. Katsaggelos, "Lip Tracking for MPEG-4 Facial Animation," in Proc. 4th IEEE Int. Conf. on Multimodal Interfaces (ICMI'02), 2002.
[3] P. Eisert, T. Wiegand, and B. Girod, "Model-Aided Coding: A New Approach to Incorporate Facial Animation into Motion Compensated Video Coding," IEEE Trans. on Circuits and Systems for Video Technology, vol. 10, no. 3, April 2000.
[4] S. T. Worrall, A. H. Sadka, and A. M. Kondoz, "3-D Facial Animation for Very Low Bit Rate Mobile Video," in Proc. IEE Int. Conf. on 3G Mobile Communication Technologies, May 2002.
[5] F. Erol, "Modeling and Animating Personalized Faces," M.Sc. thesis, Bilkent University, January 2002.
[6] T. Kanade, J. Cohn, and Y. Tian, "Comprehensive Database for Facial Expression Analysis," 2000.
[7] H. Seyedarabi, A. Aghagolzadeh, and S. Khanmohammadi, "Facial Expressions Recognition from Static Images Using Neural Networks and Fuzzy Logic," in Proc. 2nd Iranian Conference on Machine Vision and Image Processing (MVIP 2003), vol. 1, pp. 7-12, Tehran, 2003.
[8] H. Seyedarabi, A. Aghagolzadeh, and S. Khanmohammadi, "Facial Expressions Recognition from Image Sequences Using Cross-correlation Based Optical-Flow and Radial Basis Neural Networks," in Proc. 12th Iranian Conference on Electrical Engineering (ICEE 2004), vol. 1, pp. 165-170, Mashhad, 2004.
[9] H. Seyedarabi, A. Aghagolzadeh, and S. Khanmohammadi, "Recognition of Six Basic Facial Expressions by Feature-Points Tracking Using RBF Neural Networks and Fuzzy Inference System," in Proc. IEEE Int. Conf. on Multimedia & Expo (ICME 2004), Taipei, Taiwan, June 2004.
[10] A. Aghagolzadeh, H. Seyedarabi, and S. Khanmohammadi, "Single and Composite Action Units Classification in Facial Expressions by Feature-Points Tracking and RBF Neural Networks," in Proc. Ukrainian Int. Conf. on Signal/Image Processing (UkrObraz 2004), Kiev, Ukraine, October 2004.