Interactive Robot Programming

As robots enter the human environment and come into contact with inexperienced users, they need to be able to interact with those users in an intuitive fashion: the keyboard and mouse are no longer acceptable as the only input modalities. Humans should be able to communicate with robots using methods as similar as possible to the concise, rich, and diverse means they use to communicate with one another. Cooperation among humans is multi-modal and often intuitive, whereas current methods of cooperation among robots, and between robots and humans, whether for programming or control, are generally highly specified and inflexible. Cooperation between a human and a robot system involves communicating intent through the available modes of interaction. The Interactive Robot Programming group in the AML works toward a comprehensive human-machine interface that allows non-experts to conveniently compose software for, and cooperate with, robot systems.

Projects:

Gesture-Based Control of a Mobile Robot
Multimodal Control of Mobile Robots
Interactive Control of a Manipulator
Gesture-Based Programming

GESTURE-BASED CONTROL OF A MOBILE ROBOT

Compared to a mouse and keyboard, hand gestures carry highly redundant geometric and temporal information. They offer a rich vocabulary while remaining intuitive to users. These features make hand gestures an attractive tool for interacting with robots. For this particular system, we have experimented with the use of single-handed gestures to control a single mobile robot, and we have developed a gesture spotting and recognition algorithm for it based on a Hidden Markov Model (HMM).
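
To illustrate the recognition step, here is a minimal Python sketch that scores a quantized hand-motion sequence against one small HMM per gesture class using the forward algorithm and reports the most likely class. The gestures, state counts, and all model parameters below are toy placeholders for illustration, not the models trained in the actual system.

    import numpy as np

    def forward_log_likelihood(obs, pi, A, B):
        """log P(obs | model) via the scaled forward algorithm.
        pi: (N,) initial state probs; A: (N,N) state transitions;
        B: (N,M) emission probs over M discrete observation symbols."""
        alpha = pi * B[:, obs[0]]
        log_lik = 0.0
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            scale = alpha.sum()      # rescale to avoid numeric underflow
            log_lik += np.log(scale)
            alpha = alpha / scale
        return log_lik + np.log(alpha.sum())

    # Two toy 2-state gesture models over 3 quantized motion symbols.
    models = {
        "wave":  (np.array([0.9, 0.1]),
                  np.array([[0.7, 0.3], [0.4, 0.6]]),
                  np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]])),
        "point": (np.array([0.5, 0.5]),
                  np.array([[0.6, 0.4], [0.3, 0.7]]),
                  np.array([[0.1, 0.1, 0.8], [0.2, 0.2, 0.6]])),
    }

    obs = [0, 1, 0, 1, 1]  # quantized sequence from the hand tracker
    best = max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))
    print("recognized gesture:", best)

In the full setting, gesture spotting additionally requires segmenting meaningful gestures out of the continuous motion stream, for example by running a competing filler model alongside the gesture models.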

Project Web Site

Project Contact: Pradeep K. Khosla

MULTIMODAL CONTROL OF MOBILE ROBOTS

We are currently investigating a multi-modal interaction scenario in which mobile robots are controlled through two-handed gestures and speech. Multi-modality comes in handy when the user needs to teach a new skill and compose programs out of basic skill sets known as primitives. We are using HTK (the Hidden Markov Model Toolkit), an off-the-shelf software component, for gesture and speech recognition. Cye, the mobile robot, serves as the test-bed.
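
As a rough sketch of how the two channels might be fused, the Python fragment below merges one speech result and one roughly simultaneous gesture result into a single primitive invocation: speech selects the action while the gesture grounds it geometrically. The verbs, primitive names, and data fields are invented for the example and are not the actual Cye interface.

    from dataclasses import dataclass

    @dataclass
    class SpeechResult:
        verb: str            # e.g. "go", "turn", "record"

    @dataclass
    class GestureResult:
        kind: str            # e.g. "point", "arc"
        location: tuple      # (x, y) in the workspace, from hand tracking

    # Hypothetical mapping from spoken verbs to robot primitives.
    PRIMITIVES = {"go": "MoveTo", "turn": "TurnToward", "record": "BeginSkill"}

    def fuse(speech, gesture):
        """Merge a speech result and a co-occurring gesture into one
        primitive call; unknown verbs are rejected so the user can
        be asked to repeat."""
        primitive = PRIMITIVES.get(speech.verb)
        if primitive is None:
            return None
        return (primitive, gesture.location)

    cmd = fuse(SpeechResult("go"), GestureResult("point", (2.0, 1.5)))
    print(cmd)   # -> ('MoveTo', (2.0, 1.5))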

Project Web Site

Project Contact: Pradeep K. Khosla

INTERACTIVE CONTROL OF A MANIPULATOR

Another experiment demonstrates the usefulness of hand gestures as an interaction mode for a robotic system. The manipulator performs a drawing task using pre-defined geometric primitives, while the user adjusts their parameters through hand gestures. The system will be extended into a fully functional interactive programming system by providing additional modalities (such as speech and tactile feedback) for composing and modifying primitives.
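
The sketch below illustrates the idea of a parameterized drawing primitive whose parameters are reshaped by live gesture input. The CirclePrimitive class and the hand-spread mapping are hypothetical stand-ins, not the manipulator's actual primitive interface.

    import math

    class CirclePrimitive:
        """A geometric primitive the manipulator traces; the user's
        hand gesture rescales the radius while the robot is drawing."""
        def __init__(self, center=(0.0, 0.0), radius=0.10):
            self.center = center
            self.radius = radius

        def apply_gesture(self, hand_spread):
            # Map a normalized hand-spread measurement (0..1) onto a
            # radius range; clamp to keep the drawing in the workspace.
            self.radius = 0.05 + 0.20 * max(0.0, min(1.0, hand_spread))

        def waypoints(self, n=8):
            """Sample n points on the circle for the arm to trace."""
            cx, cy = self.center
            return [(cx + self.radius * math.cos(2 * math.pi * k / n),
                     cy + self.radius * math.sin(2 * math.pi * k / n))
                    for k in range(n)]

    circle = CirclePrimitive()
    circle.apply_gesture(hand_spread=0.5)   # user opens the hand halfway
    print(circle.waypoints(n=4))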

Project Web Site

Project Contact: Pradeep K. Khosla

GESTURE-BASED PROGRAMMING

Gesture-Based Programming is a new form of programming by human demonstration that views the demonstration as a series of inexact gestures conveying the intention of the task strategy, not the details of the strategy itself. This is analogous to the type of programming that occurs between a human teacher and a student, and it is more intuitive for both. However, it requires a shared ontology between teacher and student, in the form of a common skill database, to abstract the observed gestures into meaningful intentions that can be mapped onto previous experiences and previously acquired skills.
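
A minimal sketch of the shared-ontology idea follows: observed gestures are abstracted into an intention by matching them against a database of previously acquired skills. The skill names, gesture features, and overlap heuristic are invented for illustration and are a crude stand-in for the actual abstraction machinery.

    # Hypothetical skill database: each stored skill is described by the
    # set of gesture features that characterize its demonstrations.
    SKILL_DATABASE = {
        "guarded_move": {"approach", "contact"},
        "peg_insert":   {"approach", "align", "push"},
        "grasp":        {"reach", "close_hand"},
    }

    def abstract_demonstration(observed_gestures):
        """Map an inexact gesture sequence onto the stored skill whose
        feature set best overlaps the observation."""
        observed = set(observed_gestures)
        return max(SKILL_DATABASE,
                   key=lambda s: len(SKILL_DATABASE[s] & observed))

    demo = ["reach", "approach", "align", "push"]
    print(abstract_demonstration(demo))   # -> "peg_insert"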

Project Web Site

Project Contact: Pradeep K. Khosla

Last modified: Mon Aug 20 19:06:20 GMT 2001