VASC Seminar Announcement
=========================

Date:    Monday, 9/27/99
Time:    3:30-4:30
Place:   Smith Hall 2nd Floor Common Area

Speaker: Jie Yang
         CMU Human-Computer Interaction Institute
         http://www.cs.cmu.edu/~yang

Title:   Real-Time Tracking for Multimodal HCI

Abstract:

While multimodal interfaces offer greater flexibility and robustness, they have largely been pen/voice-based, user-activated, and operated in settings that require constraining devices. In this talk, I will discuss why real-time tracking is important in a multimodal interface and how visual information can improve human-computer interaction. First, I focus on techniques for tracking human faces and facial features, addressing two important issues in real-time face tracking: what to track and how to track it. I demonstrate a real-time face tracker that runs on a low-end workstation or a PC (Pentium 233 or above) equipped with a framegrabber and a camera. I then present a top-down approach to tracking facial features: based on the face tracker, facial features such as eyes, nostrils, and lip corners can be efficiently located within the facial area. I describe applications of face and facial-feature tracking to teleconferencing and multimodal human-computer interaction, and I conclude by introducing applications of real-time tracking in an ongoing project, the Meeting Room of the Future.

Biosketch:

Jie Yang is a Research Computer Scientist in the HCI Institute at Carnegie Mellon University, where his main research interests are multimodal human-computer interaction, computer vision, and pattern recognition. Dr. Yang leads a research group in the Interactive Systems Lab that develops systems to perceive human activities in front of a computer and in a meeting room, and to assist people in mobile environments.