15-494 Cognitive Robotics
Spring 2010

# Cognitive Robotics: Lab 8/Homework 7

## Part I: Forward Kinematics: Foot to Gripper Distance

This part of the lab demonstrates forward kinematics calculations. We will use `linkToBase()` to calculate the location of the gripper in base frame coordinates, and the same for the right front foot. We can measure the distance between the foot and the gripper by subtracting these two vectors and taking the norm of the result. Read the source code for FootToGripper.cc.fsm, then run the demo from Root Control > Framework Demos > Kinematics Demos. How accurate are the distance estimates?
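The distance computation the demo performs can be sketched in plain C++. This is only the vector arithmetic; in the actual demo the two positions come from `linkToBase()` calls for the gripper and right-front-foot reference frames, and the coordinates below are made-up values for illustration.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

// Norm of the difference between two base-frame position vectors,
// i.e. the straight-line distance between the two points.
double distanceBetween(const Vec3& a, const Vec3& b) {
    double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}
```

For example, a gripper at (150, 30, 80) mm and a foot at (110, 90, 0) mm would be about 107.7 mm apart.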

With the demo running, move the arm joints so that the arm points straight out, and then rotate the shoulder so that the gripper's y coordinate is as close to zero as possible. Based on the reported x and z coordinates, where do you think the origin of the base frame is located?

## Part II: Inverse Kinematics: Staring at the Gripper

In this part we'll perform an inverse kinematics calculation to make the camera track the gripper as you move the arm around. Read the code for CameraTrackGripper.cc.fsm. Then run the demo from Root Control > Framework Demos > Kinematics Demos. Turn on the RawCam viewer and move the arm around. Watch how the `linkToBase()` transformation matrix for the gripper changes as the arm moves.
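To see what the inverse kinematics solver is doing geometrically, consider a simplified pan/tilt model: given the target point in a head-centered frame (x forward, y left, z up), two `atan2` calls give the angles that aim the optical axis at it. The demo itself hands the problem to Tekkotsu's kinematics engine rather than solving it by hand; this sketch just illustrates the underlying geometry.

```cpp
#include <cmath>

// Pan/tilt angles (radians) that point a camera's optical axis at (x, y, z),
// expressed in a head-centered frame: x forward, y left, z up.
void panTiltToward(double x, double y, double z, double& pan, double& tilt) {
    pan  = std::atan2(y, x);                      // rotate left/right first
    tilt = std::atan2(z, std::sqrt(x*x + y*y));   // then up/down in the rotated plane
}
```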

## Part III: Sorting Objects

Start on this now, and finish it for homework. Your task is to have the robot sort easter eggs based on color. There will be two lines made from colored tape. If the easter egg is blue, it should be pushed across the blue line; if it's pink, it should be pushed across the pink line. Your program shouldn't depend on the lines being in any specific location, but you can assume that they will be somewhere in front of and to the right of the robot's midline.

Start by moving the arm to a position off to the far left, but with the gripper fingers facing to the right. Position the camera so that it can see the gripper in the lower right corner of the image. When the user presses the green button, use the MapBuilder to construct a local map containing both ellipses and lines. Verify that there is an ellipse inside the gripper. (Hint: since the base frame is also the origin of the local map, calculate the gripper's coordinates in the base frame, get the centroid of the ellipse, and make sure they are reasonably close to each other.) If you don't have a viable ellipse and an appropriate line, complain and return to the start to await the next button press.
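The "is there an egg in the gripper?" check from the hint above amounts to a proximity test in the ground plane. A minimal sketch, where the gripper coordinates would come from `linkToBase()` and the egg coordinates from the ellipse's centroid, and the 40 mm tolerance is an assumed value you would tune:

```cpp
#include <cmath>

// True if the ellipse centroid (ex, ey) lies within tolMm of the
// gripper position (gx, gy), all in base-frame millimeters.
// The 40 mm default tolerance is a guess; tune it on the robot.
bool eggInGripper(double gx, double gy, double ex, double ey,
                  double tolMm = 40.0) {
    double dx = gx - ex, dy = gy - ey;
    return std::sqrt(dx*dx + dy*dy) <= tolMm;
}
```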

You will need to calculate a target position for the arm to reach toward. Start by computing the midpoint of the correct colored line. Calculate a vector from the base frame to this midpoint, and extend the vector further by 50 mm. That is your target position.
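Since the base frame is the origin, extending the vector by 50 mm is just a rescaling: multiply the midpoint by (len + 50) / len so the target lies on the same ray, 50 mm beyond the line. A 2-D sketch of that step (the function and parameter names are illustrative, not Tekkotsu API):

```cpp
#include <cmath>

// Given the line midpoint (midX, midY) in base-frame mm, compute a push
// target extraMm farther along the same ray from the base-frame origin.
void pushTarget(double midX, double midY, double extraMm,
                double& tx, double& ty) {
    double len = std::sqrt(midX*midX + midY*midY);
    double scale = (len + extraMm) / len;   // same direction, longer by extraMm
    tx = midX * scale;
    ty = midY * scale;
}
```

For a midpoint at (300, 400) mm, which is 500 mm from the origin, a 50 mm extension yields (330, 440).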

You can use an ArmNode to move the arm; its underlying motion command, ArmMC, provides a convenient moveToPoint method. Note that moveToPoint returns false if there is no kinematic solution for the target point. If this occurs, your code should report the failure. You can use `sndman->speak("...")` to do so.

After pushing the easter egg across the line, return the arm to its starting position and wait for the user to supply the next easter egg.