Research Papers

Facial Expression Analysis for Content-Based Video Retrieval

Author and Article Information
P. Geetha

Research Scholar
Computer Science Engineering Department,
Sathyabama University,
Chennai 600109, India
e-mail: geethap@annauniv.edu

Vasumathi Narayanan

Professor
Electronics and Communication
Engineering Department,
St. Joseph's College of Engineering,
Chennai 600109, India
e-mail: vasumathin@yahoo.com

Corresponding author.

Contributed by the Computers and Information Division of ASME for publication in the JOURNAL OF COMPUTING AND INFORMATION SCIENCE IN ENGINEERING. Manuscript received June 7, 2011; final manuscript received May 27, 2014; published online September 1, 2014. Editor: Bahram Ravani.

J. Comput. Inf. Sci. Eng. 14(4), 041001 (Sep 01, 2014) (6 pages); Paper No: JCISE-11-1351; doi: 10.1115/1.4027885. History: Received June 07, 2011; Revised May 27, 2014

In this work, we propose a facial expression recognition technique that bridges the semantic gap between the low-level features extractable in a content-based video retrieval system and the high-level emotions they convey. The paper aims to provide accurate and reliable facial expression recognition of a dominant person in video frames using deterministic binary cellular automata (DBCA). Both geometric and appearance-based features are used, and efficient dimension-reduction techniques are applied for face detection and recognition. Using the facial action coding system (FACS), nearly any anatomically possible facial expression can be coded automatically by deconstructing it into what are called action units (AUs). A scheme employing two-dimensional deterministic binary cellular automata (2D-DBCA) is then developed to classify the facial expressions representing various emotions and thereby retrieve video scenes/shots. Extensive experiments on the Cohn–Kanade database, the Yale database, and large movie videos show the superiority of the proposed method over support vector machine (SVM), hidden Markov model (HMM), and neural network (NN) classifiers.
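
To make the classification idea concrete, the following is a minimal sketch of one update of a two-dimensional binary cellular automaton applied to a binarized feature map. The majority-vote rule over the Moore neighborhood, the grid size, and the Hamming-distance matching mentioned in the comments are illustrative assumptions; the paper's actual 2D-DBCA rules are defined in the full text.

import numpy as np

def dbca_step(grid):
    """One deterministic update of a 2D binary cellular automaton.

    Illustrative rule (an assumption, not the paper's): each cell takes
    the majority value of its 8-connected Moore neighborhood; a 4-4 tie
    keeps the current state. Boundaries are zero-padded.
    """
    p = np.pad(grid, 1)  # zero padding around the grid
    # Sum of the eight neighbors of every cell.
    neighbors = (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:]
                 + p[1:-1, :-2] + p[1:-1, 2:]
                 + p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:])
    new = grid.copy()
    new[neighbors >= 5] = 1  # majority of neighbors are 1
    new[neighbors <= 3] = 0  # majority of neighbors are 0
    return new

# Evolve a binarized face-region patch for a few steps; the settled
# pattern could then be matched against stored expression templates,
# e.g., by Hamming distance. The random 32 x 32 patch is a stand-in
# for a real binarized feature map.
patch = (np.random.rand(32, 32) > 0.5).astype(np.uint8)
for _ in range(5):
    patch = dbca_step(patch)

Because the update is deterministic and purely local, the evolution of the grid is cheap to compute and reproducible, which is what makes cellular automata attractive as classifiers compared with iteratively trained models such as SVMs or NNs.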

Copyright © 2014 by ASME


Figures

Fig. 1: (a) Fiducial point set on a given face and (b) block diagram of the proposed work

Fig. 4: Steps followed in the analysis and recognition of the respective AUs using MATLAB

Fig. 5: Precision–recall comparison of the proposed work

Fig. 6: ROC graphs for recognition of AU 1 using the proposed method, obtained by adjusting the threshold value ε
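
The ROC curves of Fig. 6 are traced by sweeping the decision threshold ε over per-frame match scores. The sketch below shows one common way such a sweep can be computed; the scores and labels are synthetic stand-ins, not data from the paper, and the same counts yield the precision and recall values of the kind plotted in Fig. 5.

import numpy as np

def roc_points(scores, labels, thresholds):
    """(false positive rate, true positive rate) for each threshold eps.

    scores: per-frame confidence that AU 1 is present
    labels: ground truth (1 = AU 1 present, 0 = absent)
    """
    pts = []
    for eps in thresholds:
        pred = scores >= eps                      # declare AU 1 present
        tp = np.sum(pred & (labels == 1))
        fp = np.sum(pred & (labels == 0))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        tpr = tp / (tp + fn) if tp + fn else 0.0  # recall
        fpr = fp / (fp + tn) if fp + tn else 0.0
        # Precision for a Fig. 5-style curve would be tp / (tp + fp).
        pts.append((fpr, tpr))
    return pts

# Synthetic stand-in data; in practice the scores would come from the
# 2D-DBCA classifier evaluated on labeled frames.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 200)
scores = 0.4 * labels + 0.6 * rng.random(200)
curve = roc_points(scores, labels, np.linspace(0.0, 1.0, 21))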
