AI Seminar 2004/2005

(please see the main page for schedule information)

Speaker: Jeff Cohn

Machine Analysis of Emotion Expression and Paralinguistic Communication


Both the configuration and the timing of facial actions are important in emotion expression and recognition. The configuration of facial actions in relation to emotion, communicative intent, and action tendencies has been a major research topic. Less is known about the timing of facial actions, in part because existing measurement methods, which rely upon human observers, are relatively coarse and time-consuming to use. To investigate the timing of facial actions, we developed an automatic facial image analysis (AFA) system that has high consistency with ground-truth measures of rigid (head movement) and non-rigid (expression) motion. In a series of studies, we have begun to investigate the timing of facial actions, head motion, and direction of gaze in adults in solitary and social contexts and in mothers and infants during face-to-face interaction. Facial action, as indexed by lip-corner displacement during spontaneous smiles, is consistent with ballistic timing and moderately correlated with head and eye motion, as suggested by the neuroscience literature. Particular patterns of correlation appear specific to interpersonal context and communicative intent. Our findings point to the hypothesis that the communicative meaning of similar facial actions may be disambiguated by attending to specific patterns of facial action dynamics and their coordination with head rotation and gaze.

Speaker Bio

Jeffrey Cohn is Associate Professor of Psychology and Psychiatry at the University of Pittsburgh and Adjunct Faculty at the Robotics Institute, Carnegie Mellon University. He earned his PhD in Clinical Psychology from the University of Massachusetts Amherst and completed a clinical internship at the University of Maryland Medical Center. For the past 20 years, he has conducted investigations in the theory and science of emotion, depression, and nonverbal communication. He has co-led interdisciplinary and inter-institutional efforts to develop methods for automated analysis of facial expression and prosody, and has applied these tools to research in human emotion, communication, biometrics, and human-computer interaction. He has published over 90 papers on these topics. His research has been supported by grants from the National Institute of Mental Health, the National Institute of Child Health and Human Development, the National Science Foundation, and the Defense Advanced Research Projects Agency.

Maintainer: Patrick Riley
Last modified: Tue Oct 5 11:18:50 EDT 2004