15-494 Cognitive Robotics
Spring 2007

Cognitive Robotics: Lab 8/Homework 7


Part I: Stare At Paw Demo

Run the "stare at paw" demo at Root Control > Mode Switch > Kinematics Demos > New Stare At Paw. Hold and press the button on one of the two front legs; this makes the leg go limp. Now move the leg around and observe that the head moves to keep the paw centered in the camera image. Use the RawCam viewer, and make sure Emergency Stop is off.

Review the source code for this demo, which is in Tekkotsu/Behaviors/Demos/StareAtPawBehavior2.cc.
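The core of the behavior is simple: on each sensor update, look up the paw's current position in the body reference frame and point the head at it. Below is a minimal sketch of that idea, not the actual file contents; it assumes the NEWMAT-based Kinematics interface of this Tekkotsu release and the interest-point name "ToeLFrPaw", so check the interest-point names for your model.

    // Sketch only -- see StareAtPawBehavior2.cc for the real implementation.
    virtual void processEvent(const EventBase& event) {
      // Left front toe position, expressed in the base (body) frame;
      // "ToeLFrPaw" is an assumed interest-point name.
      NEWMAT::ColumnVector paw =
        kine->getJointInterestPoint(BaseFrameOffset, "ToeLFrPaw");
      // Point the head at that location (head_id was returned by
      // motman->addPersistentMotion() for a HeadPointerMC):
      MMAccessor<HeadPointerMC> head(head_id);
      head->lookAtPoint(paw(1), paw(2), paw(3));  // NEWMAT indexes from 1
    }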

Part II: Inter-Paw Distance Demo

Compile and run the demo program in the Kinematics chapter of the Tekkotsu tutorial that calculates the distance between the dog's two front paws.

Run the behavior with the dog in Emergency Stop mode so you can move the front legs together and apart. Observe the result.
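For reference, the heart of that computation looks roughly like the sketch below. It assumes the NEWMAT-based Kinematics interface and the interest-point names "ToeLFrPaw" and "ToeRFrPaw"; the tutorial chapter has the authoritative version.

    #include <cmath>
    #include <iostream>
    // Both front toes, expressed in the common base frame:
    NEWMAT::ColumnVector lf = kine->getJointInterestPoint(BaseFrameOffset, "ToeLFrPaw");
    NEWMAT::ColumnVector rf = kine->getJointInterestPoint(BaseFrameOffset, "ToeRFrPaw");
    NEWMAT::ColumnVector d = lf - rf;
    // Euclidean distance in millimeters (NEWMAT vectors index from 1):
    double dist = std::sqrt(d(1)*d(1) + d(2)*d(2) + d(3)*d(3));
    std::cout << "Inter-paw distance: " << dist << " mm" << std::endl;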

Part III: Sobriety Test

Get the AIBO to stare at (track by moving the head) a "ball" (an easter egg of any color; I suggest the green ones) using any code you like. (For example, you could adapt some of the demo code for chasing the pink ball.) As you move the ball around, with the AIBO sitting on its stand, the head should swivel to follow it. Now, every 5 seconds, freeze the head and move the AIBO's left front paw up so that the toe just touches the lower lip. (Hint: use getJointInterestPoint and solveLinkPosition; a sketch follows below.) Hold the paw there for 2 seconds, then move the paw back down to a neutral position, and resume tracking the ball.
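One plausible shape for the paw-to-lip move is sketched here. The interest-point name "LowerLip" is a placeholder (look up the real lower-lip interest point for your model), and the Kinematics::getInterestPoint signature is an assumption; verify both against the Kinematics documentation.

    // Called from a timer event every 5 seconds; pose_id is the MC_ID of a
    // persistent PostureMC that is currently holding the dog's pose.
    void touchLip() {
      // Current lower-lip position in the base frame ("LowerLip" is a
      // placeholder name):
      NEWMAT::ColumnVector lip =
        kine->getJointInterestPoint(BaseFrameOffset, "LowerLip");
      // Link index and effector offset of the left front toe
      // (assumed signature for Kinematics::getInterestPoint):
      unsigned int toeLink;
      NEWMAT::ColumnVector toeOffset;
      kine->getInterestPoint("ToeLFrPaw", toeLink, toeOffset);
      // Solve IK so the toe lands on the lip; the solution joint angles go
      // directly into the active PostureMC:
      MMAccessor<PostureMC> pose(pose_id);
      pose->solveLinkPosition(lip, toeLink, toeOffset);
      // ...then set a 2-second timer to lower the paw and resume tracking.
    }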

Part IV: Can't Touch This

In this exercise you will have the AIBO support itself on three legs, freeing the fourth leg to tap the top of an orange easter egg half. We have supplied a posture file SITUP.POS to establish the desired posture.
  • Use a PostureMC to put the dog into the desired posture. (The constructor can load the file SITUP.POS.) If the motion command is persistent, the dog will hold that posture.
  • Use a HeadPointerMC to position the dog's head so that it is looking at the area that is reachable by the left front paw. You can use lookAtPoint to stare at a point with coordinates (100, 75, -200).
  • Every 5 seconds, use the MapBuilder to find orange blobs in the camera image and build a local map.
  • Extract the coordinates of the centroid of the largest orange blob in localShS. This will be the target for the leg to reach toward.
  • Use Kinematics::getInterestPoint to calculate the link and effector offset for the left front toe.
  • Use your PostureMC's inverse kinematics solver to calculate how to move the left front toe to the position of the largest orange blob. If the solver returns true, the results of the calculation will be stored in the PostureMC (which is still an active motion command), so the leg should immediately reach toward the egg. Hold the leg there for just one second, then bring it back to a neutral position, raised up out of the way of the camera. (A combined sketch of these steps appears after this list.)
  • If the solver returns false, or if you can't see any orange blobs, have the AIBO bark.
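The following sketch combines the map-building and reaching steps. The MapBuilderRequest field names follow the DualCoding tutorial of this era, and the sound file "barkmed.wav" is one of the stock Tekkotsu sounds; treat both, like the interest-point name and getInterestPoint signature, as assumptions to verify against your source tree.

    // Every 5 seconds (e.g., from a timer event): ask the MapBuilder for a
    // local map of orange blobs, then wait for its completion event.
    void requestMap() {
      MapBuilderRequest mapreq(MapBuilderRequest::localMap);
      mapreq.objectColors[blobDataType].insert(ProjectInterface::getColorIndex("orange"));
      mapBuilder.executeRequest(mapreq);
    }

    // Once the map is built, find the largest orange blob in localShS and
    // reach for it; pose_id is the MC_ID of the persistent PostureMC.
    void reachForEgg() {
      std::vector<Shape<BlobData> > blobs = select_type<BlobData>(localShS);
      if ( blobs.empty() ) {
        sndman->playFile("barkmed.wav");  // no orange blobs visible: bark
        return;
      }
      Shape<BlobData> biggest = blobs[0];
      for ( unsigned int i = 1; i < blobs.size(); i++ )
        if ( blobs[i]->getArea() > biggest->getArea() )
          biggest = blobs[i];
      Point c = biggest->getCentroid();
      NEWMAT::ColumnVector target(4);  // homogeneous coordinates
      target(1) = c.coordX();  target(2) = c.coordY();
      target(3) = c.coordZ();  target(4) = 1;
      // Link and effector offset for the left front toe (assumed signature):
      unsigned int toeLink;
      NEWMAT::ColumnVector toeOffset;
      kine->getInterestPoint("ToeLFrPaw", toeLink, toeOffset);
      // Solve IK inside the still-active PostureMC; on success the leg
      // reaches immediately, so schedule a 1-second timer to retract it.
      MMAccessor<PostureMC> pose(pose_id);
      if ( pose->solveLinkPosition(target, toeLink, toeOffset) )
        erouter->addTimer(this, retract_timer, 1000, false);
      else
        sndman->playFile("barkmed.wav");  // unreachable target: bark
    }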
When you run your demo, move the orange easter egg half around and verify that the leg's motions track the object.


Turn in your answers to parts III and IV by March 31.
Dave Touretzky and Ethan Tira-Thompson