

Warning:
This page is provided for historical and archival purposes only. While the seminar dates are correct, we offer no guarantee of informational accuracy or link validity. Contact information for the speakers, hosts, and seminar committee is certainly out of date.


RI SEMINAR -- Yasuyoshi Yokokohji



ABSTRACT

Haptic interfaces have been recognized as important input/output channels to/from the virtual environment. Usually a haptic interface is combined with a visual display such as a head-mounted display or a stereoscopic display screen. Correct registration of the visual and haptic interfaces, however, is not easy to achieve and has not been seriously considered. For example, some systems simply place the graphics display beside the haptic interface, resulting in a "feeling here but looking there" situation.

One of the most important potential applications of VR systems is training and simulation. For training visual-motor skills, correct visual/haptic registration is important because a visual-motor skill is composed of tightly coupled visual stimuli (associated with task coordinates) and kinesthetic stimuli (associated with body coordinates). If the two kinds of stimuli are inconsistent, inter-sensory conflicts may occur, followed by visuo-motor adaptations. In real performance situations after the training, there would then be no significant skill transfer, or, in the worst case, the training might actually hurt performance.

In this talk, I propose a new concept of visual/haptic interfaces called a WYSIWYF (What You can See Is What You can Feel) display. The proposed concept ensures correct visual/haptic registration so that what the user sees through the visual interface is consistent with exactly what he/she feels through the haptic device. In other words, the user's hand can "encounter" the haptic device exactly when and where the hand touches an object in the virtual environment. A vision-based object tracking technique and a video-keying technique are used to achieve correct visual/haptic registration, and a fast physically-based simulation algorithm has been implemented for haptic rendering.
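
To make the haptic-rendering idea concrete, here is a minimal sketch in C of a penalty-based servo loop. It is purely illustrative and is not the speaker's algorithm: it assumes a hypothetical 1 kHz loop and a single virtual wall at x = 0, with made-up stiffness and damping constants; a real physically-based simulation would of course handle arbitrary geometry and dynamics.

    /* Minimal haptic rendering sketch (hypothetical, not the actual
       system's code). Assumes a 1 kHz servo loop and a virtual wall at
       x = 0: when the measured hand position penetrates the wall, a
       penalty (spring-damper) force pushes back; otherwise the device
       renders free space. */
    #include <stdio.h>

    #define STIFFNESS 1000.0   /* N/m, virtual wall stiffness (assumed) */
    #define DAMPING      5.0   /* N*s/m, damping to stabilize contact   */

    /* One servo cycle: map measured position/velocity to a command force. */
    static double render_force(double x, double xdot)
    {
        if (x >= 0.0)
            return 0.0;                          /* free space: no force   */
        return -STIFFNESS * x - DAMPING * xdot;  /* push back out of wall  */
    }

    int main(void)
    {
        /* Simulated hand trajectory pressing into the wall and retreating. */
        double xs[] = { 0.01, 0.0, -0.002, -0.005, -0.002, 0.01 };
        double prev = xs[0], dt = 0.001;         /* 1 kHz loop period */
        for (int i = 0; i < 6; i++) {
            double xdot = (xs[i] - prev) / dt;
            printf("x=%+.3f m  f=%+.2f N\n", xs[i], render_force(xs[i], xdot));
            prev = xs[i];
        }
        return 0;
    }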

The first prototype was built using a color liquid crystal display (LCD) panel and a CCD camera for the visual interface component and a PUMA 560 robot for the haptic interface component. A demonstration of the prototype system will be shown on video.
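
In the prototype, the video-keying step was presumably done with keying hardware; as a rough software illustration only, the sketch below chroma-keys a live camera image over the rendered scene so that the user's real hand appears inside the virtual environment. The key color, pixel values, and threshold are all hypothetical.

    /* Minimal chroma-key compositing sketch (hypothetical). For each
       pixel, if the live camera pixel is close to the key color (e.g.
       a blue backdrop), show the rendered virtual scene; otherwise pass
       the camera pixel through, so the user's real hand is composited
       into the virtual scene. */
    #include <stdio.h>

    typedef struct { unsigned char r, g, b; } Pixel;

    /* Squared RGB distance between a camera pixel and the key color. */
    static int key_distance(Pixel p, Pixel key)
    {
        int dr = p.r - key.r, dg = p.g - key.g, db = p.b - key.b;
        return dr * dr + dg * dg + db * db;
    }

    /* Composite one pixel: backdrop-colored camera pixels are replaced
       by the rendered virtual scene; everything else passes through. */
    static Pixel composite(Pixel camera, Pixel scene, Pixel key, int threshold)
    {
        return key_distance(camera, key) < threshold ? scene : camera;
    }

    int main(void)
    {
        Pixel key   = {   0,   0, 255 };  /* blue backdrop (assumed)      */
        Pixel hand  = { 200, 150, 120 };  /* skin-toned camera pixel      */
        Pixel cloth = {  10,   5, 250 };  /* camera pixel on the backdrop */
        Pixel scene = {  90,  90,  90 };  /* rendered virtual-scene pixel */

        Pixel a = composite(hand,  scene, key, 2000);  /* keeps the hand  */
        Pixel b = composite(cloth, scene, key, 2000);  /* shows the scene */
        printf("hand  -> (%d,%d,%d)\n", a.r, a.g, a.b);
        printf("cloth -> (%d,%d,%d)\n", b.r, b.g, b.b);
        return 0;
    }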


BIOGRAPHY

Dr. Yokokohji received his Ph.D. in mechanical engineering from Kyoto University in 1991. Since 1992, he has been an associate professor in the Department of Mechanical Engineering at Kyoto University. Since April 1994, he has been at CMU as a visiting research scientist, working with Ralph Hollis and Takeo Kanade. His background is in robotics, especially kinematics/dynamics/control and teleoperation. His current interests include haptic interfaces, vision-based head tracking, and machine-mediated training (Project home page).
Christopher Lee | chrislee@ri.cmu.edu
Last modified: Fri Mar 22 13:50:48 1996