Facial Expression Analysis

Professor Jeffrey Cohn
Department of Psychology
University of Pittsburgh

Professor Takeo Kanade
Robotics Institute
Carnegie Mellon University

James Lien
Department of Electrical Engineering
University of Pittsburgh

Yu-Te Wu
Department of Electrical Engineering
University of Pittsburgh

Adena Zlochower
Department of Psychology
University of Pittsburgh

Introduction

Facial expression communicates information about emotions, regulates interpersonal behavior and person perception, indexes physiologic functioning, and is essential to evaluating preverbal infants. Current human-observer methods of facial expression analysis are labor-intensive and difficult to standardize across laboratories and over time. These factors force investigators to use less specific systems whose convergent validity is often unknown. To make more rigorous, quantitative measurement of facial expression feasible in diverse applications, our interdisciplinary research group, with expertise in facial expression and computerized image processing, is developing automated methods of facial expression analysis. In the methods under development, automatic feature extractors detect and track changes in the facial features of a subject in a digitized image sequence (30 images per second). From the extracted features, a neural-network-based classifier estimates the intensity of FACS (Facial Action Coding System) action units (AUs) in each video image. A user interface will permit investigators to define facial configurations (per EMFACS, the FACS Dictionary, MAX, or their own specifications) and to generate time-series or summary data files for statistical analysis.
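The per-frame pipeline described above (feature extraction, then AU-intensity classification, then time-series output) can be sketched in a few lines. This is an illustrative sketch only: the feature extractor, the linear "classifier," its weights, and the AU subset below are placeholders of ours, not the project's trained neural network.

```python
AU_NAMES = ["AU1", "AU2", "AU4"]   # hypothetical subset of FACS action units

def extract_features(frame):
    """Placeholder feature extractor: summary statistics of a frame's pixels
    (the real system tracks motion of specific facial features instead)."""
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    var = sum((p - mean) ** 2 for p in flat) / len(flat)
    return [mean, var]

def estimate_au_intensities(features, weights):
    """Placeholder linear scorer standing in for the neural-network
    classifier: one weighted score per action unit."""
    return {au: sum(w * f for w, f in zip(ws, features))
            for au, ws in zip(AU_NAMES, weights)}

def analyze_sequence(frames, weights):
    """One AU-intensity estimate per video frame (30 frames per second),
    i.e. a time series suitable for statistical analysis."""
    return [estimate_au_intensities(extract_features(f), weights)
            for f in frames]
```

The point of the structure is that every frame yields one row of AU intensities, so an entire digitized sequence becomes a time series keyed by action unit.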

Processing Steps

First, an optical flow technique is applied to real examples to extract motion information.

Anger (3.7M mpeg) Disgust (4.9M mpeg)
Fear (3.4M mpeg) Joy (3.9M mpeg)
Sad (2.8M mpeg) Surprise
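To give a feel for what optical flow extraction computes, here is a minimal single-window Lucas-Kanade estimator in pure Python. It recovers one global (u, v) displacement between two grayscale frames; this sketch is ours for illustration — the system described on this page uses a more sophisticated wavelet-based motion model (see the references below).

```python
import math

def lucas_kanade_flow(frame1, frame2):
    """Estimate a single global (u, v) translation between two grayscale
    frames (lists of lists of floats) via the Lucas-Kanade least-squares
    solution to the optical flow constraint Ix*u + Iy*v + It = 0."""
    h, w = len(frame1), len(frame1[0])
    sxx = sxy = syy = sxt = syt = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (frame1[y][x + 1] - frame1[y][x - 1]) / 2.0  # horizontal gradient
            iy = (frame1[y + 1][x] - frame1[y - 1][x]) / 2.0  # vertical gradient
            it = frame2[y][x] - frame1[y][x]                  # temporal difference
            sxx += ix * ix; sxy += ix * iy; syy += iy * iy
            sxt += ix * it; syt += iy * it
    # Solve the 2x2 normal equations A [u, v]^T = -b by Cramer's rule.
    det = sxx * syy - sxy * sxy
    u = (-sxt * syy + syt * sxy) / det
    v = (-syt * sxx + sxt * sxy) / det
    return u, v

# Synthetic check: a Gaussian blob shifted one pixel to the right between
# frames should yield a flow of roughly (u, v) = (1, 0).
def gaussian_blob(cx, cy, n=32):
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 50.0)
             for x in range(n)] for y in range(n)]

u, v = lucas_kanade_flow(gaussian_blob(15, 15), gaussian_blob(16, 15))
```

A real facial tracker applies estimators like this over many small windows centered on facial features, producing a dense motion field rather than one global vector.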

Acknowledgement

This work is funded by NIMH Grant R01MH51435.

References

Cohn, J.F., Zlochower, A., Lien, J., Wu, Y.T., & Kanade, T. (July, 1997). Automated face coding: A computer-vision based method of facial expression analysis. 7th European Conference on Facial Expression, Measurement, and Meaning, Salzburg, Austria.

Cohn, J.F., Kanade, T., Wu, Y.T., Lien, J., & Zlochower, A. (August, 1996). Facial expression analysis: Preliminary results of a new image-processing based method. International Society for Research in Emotion, Toronto.

Wu, Y.T. (September, 1997). Image registration using wavelet-based motion model and its applications. Ph.D. thesis, Department of Electrical Engineering, University of Pittsburgh.

Wu, Y.T., Kanade, T., Cohn, J.F., & Li, C.C. (1998). Optical flow estimation using wavelet motion model. ICCV'98.


Last updated: September 12th, 1997

Yu-Te Wu (ytw@cs.cmu.edu)