15-494 Cognitive Robotics
Spring 2012

Cognitive Robotics: Lab 7 & Homework 6


In this lab you will explore the Calliope's kinematic structure and Tekkotsu's forward and inverse kinematics functions.

Part I: CameraTrackGripper Demo

  1. Do this step using Mirage. Go to Root Control > Framework Demos > Kinematics Demos > CameraTrackGripper. Turn on the RawCam viewer and the Arm Controller. Move the arm and note that the left finger remains centered in the camera image. Type "msg" to the Tekkotsu console a few times to display the finger-to-base transformation matrix and observe how it changes as you move the arm. (A sketch of the corresponding kinematics call appears after this list.)
  2. Now run CameraTrackGripper on the robot, but turn off the Arm Controller. Instead, press the Play button on the Create and the robot will say "Arm relaxed". (If it doesn't say this, press the button more firmly; make sure the sound is turned up.) Now take hold of the arm and move it around; the camera will follow.
  3. Notice that the motion of the head is jerky when the RawCam viewer is active. What happens when you turn off the RawCam viewer?
  4. Notice also that with the RawCam active, the left finger is not centered in the camera image. The gripper may not be visible at all. Why is the camera so far off target on the real robot, when it was dead-on in Mirage? The reason is that we're using the kinematic description of a Calliope5SP robot, which has a shorter neck and different camera than the Calliope5KP.
  5. Creating a fully correct kinematic description of the Calliope5KP is beyond the scope of this lab, but you can make things somewhat better by changing the D-H parameters for the NECK:pan joint to reflect its true position relative to the base frame, which lies at the center of the Create at a height of 0 above the ground. Measure the pan servo's position relative to the base frame and edit the file project/ms/config/Calliope5KP.kin to change the d and r values for the NECK:pan joint; d is the joint's offset along the z axis (height) and r its offset along the x axis. (Search for NECK:pan in the file and you'll see the D-H parameters below it.) Note that if you do your editing on the workstation, you will need to use sendmyfile to install the file in the /home/user/project/ms/config directory on the robot.
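
If you want to query the finger-to-base transformation from your own code, Tekkotsu's global Kinematics instance provides it. Here is a minimal sketch; we use the gripper reference frame for illustration, and the header paths and the GripperFrameOffset constant are our assumptions about the Calliope's robot info, so check your tree if they differ:

    #include <iostream>
    #include "Motion/Kinematics.h"   // declares the global Kinematics instance "kine"
    #include "Shared/fmat.h"

    // (Inside a behavior's doEvent, for example.)
    // Forward kinematics: pose of the gripper reference frame in the base frame.
    fmat::Transform t = kine->linkToBase(GripperFrameOffset);
    fmat::Column<3> pos = t.translation();   // gripper position in mm
    std::cout << "gripper at " << pos[0] << " " << pos[1] << " " << pos[2] << std::endl;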

Part II: GripperTrackCamera demo

  1. Do this step in Mirage. Run the GripperTrackCamera demo. Make sure the Arm Controller is not running or it will fight with the demo. Use the Head Controller to move the head and notice that the arm moves appropriately to keep the left finger 100 mm from the camera.
  2. You should also notice that when the demo starts up, the arm moves to its initial position at a very high rate of speed. This is dangerous. The robot could hurt itself or a person. To experiment with this, stop the behavior, use the Arm Controller to move the gripper to a position far from the camera, shut down the Arm Controller, and run the behavior again. Think about what could happen if someone's face was close to the arm when the demo started up.
  3. Copy the demo to your project directory, and rename it GripperTrackCamera2.cc.fsm. Also edit the class name where it appears twice in the file: once in the $nodeclass and once in the REGISTER_BEHAVIOR.
  4. We're going to make the demo safe by checking the distance the arm needs to travel to reach its target position. If the distance is large, we'll use the setMaxSpeed method of PostureMC to slow down the arm. Thus, on startup, the arm will move gently to its initial position; once there, it can move at high speed as long as it's only making small corrections. Implement this change and test it in Mirage to make sure the arm behaves safely. (A sketch of the distance check appears after this list.)
  5. Once you're convinced your code is safe, test it on the robot.
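
One way to structure the speed check in step 4 is sketched below. This is a sketch only: arm_id, target, the 50 mm threshold, and the two speed values are placeholders you will need to supply or tune, and we assume the robot-info constants ArmOffset and NumArmJoints:

    #include "Motion/Kinematics.h"
    #include "Motion/MMAccessor.h"
    #include "Motion/PostureMC.h"

    // Before committing a new arm target, check how far the gripper must travel.
    // arm_id is the MotionManager ID of the demo's PostureMC; target is the new
    // gripper goal in base-frame coordinates.
    fmat::Column<3> current = kine->linkToBase(GripperFrameOffset).translation();
    float dist = (target - current).norm();    // millimeters left to travel
    const float gentleSpeed = 0.5f;            // rad/s for big moves (placeholder value)
    const float fastSpeed   = 5.0f;            // rad/s for small corrections (placeholder value)
    MMAccessor<PostureMC> arm(arm_id);
    for (unsigned int j = ArmOffset; j < ArmOffset + NumArmJoints; ++j)
      arm->setMaxSpeed(j, dist > 50 ? gentleSpeed : fastSpeed);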

Part III: Holding Hands

Write a behavior that allows you to lead the robot around by the hand.
  1. Start by moving the arm to a posture where the gripper is sticking straight out. You can use a PostureNode for this, with the setOutputCmd method to set joint angles directly rather than loading a posture file. (A sketch appears after this list.)
  2. In your state machine, leave the posture node active so it holds the arm in this fixed position.
  3. Even with the posture node active, you can displace the arm slightly by grasping it and gently pulling or pushing. This is how we will guide the robot.
  4. Write code to calculate the position of the gripper frame and notice when it is displaced from its intended position. For displacements above a threshold, have the robot walk forward or back, or turn left or right, for as long as the displacement persists, with speed proportional to the magnitude of the displacement. You'll want to use a WalkNode and the setTargetVelocity method for this, rather than a PilotNode, so you can continually change the velocity in a tight sensor loop, like the other kinematics demos we've examined in this lab. You need to keep the WalkNode active, though: you can't keep exiting and reentering it, because the walk will never stay active long enough to produce any motion. Instead, define a subclass of WalkNode along these lines:
      virtual void doStart() {
        WalkNode::doStart();   // let the walk start up and remain active
        erouter->addListener(this, EventBase::sensorEGID);   // fires on every sensor update
      }

      virtual void doEvent() {
        // Read the gripper position via forward kinematics, compare it to the
        // intended rest position, and update the running walk accordingly, e.g.:
        fmat::Column<3> pos = kine->linkToBase(GripperFrameOffset).translation();
        // setTargetVelocity(gain*(pos[0]-rest[0]), 0, turnGain*(pos[1]-rest[1]));
        // with each component zeroed while its displacement is below threshold.
      }
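
For step 1, the posture node might look something like the following sketch. The class name, joint indices, and angles are placeholders; choose angles that leave the gripper sticking straight out:

    #include "Behaviors/Nodes/PostureNode.h"
    #include "Motion/MMAccessor.h"

    class ArmStraightOut : public PostureNode {
    public:
      ArmStraightOut() : PostureNode("ArmStraightOut") {}
      virtual void doStart() {
        PostureNode::doStart();
        MMAccessor<PostureMC> p(getMC_ID());            // the node's underlying PostureMC
        p->setOutputCmd(ArmOffset+0, OutputCmd(0.0f));  // shoulder (placeholder angle)
        p->setOutputCmd(ArmOffset+1, OutputCmd(0.0f));  // elbow (placeholder angle)
        // ...and so on for the remaining arm joints
      }
    };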
    

Part IV: Continuous Trajectories

  1. Write a behavior that moves the gripper to a point slightly in front of the robot, and then moves it, at a low speed, to a point a good bit farther ahead, i.e., farther in the x direction but with the same y and z coordinates. You can do this with a series of two PostureNodes.
  2. Run your behavior, and look at the trajectory of the gripper. Although the start and end points should have the same y and z values, the gripper's height above the ground will not remain constant throughout the trajectory. Why not?
  3. Let's try a different approach: divide the trajectory into small steps and do a separate IK calculation for each step. This way the trajectory can approximate a straight line. Use a DynamicMotionSequenceNode for this. Inside the doStart, create a PostureEngine instance. You can call its solveLinkPosition method to calculate a set of arm joint angles, and then call the MotionSequenceEngine's setPos method to copy this posture into the motion sequence. Do this in a loop as you interpolate the target position from the start point to the end point. When you've calculated the complete motion sequence and returned from the doStart, the DynamicMotionSequenceNode will play the sequence. (One possible shape for this loop is sketched after this list.) What does the trajectory look like?
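
Here is one possible shape for the interpolation loop. This is a sketch: startPt and endPt are your two endpoints from step 1, the step count and timing are placeholder values, and the setPos call follows the handout's naming, so check MotionSequenceEngine.h if your Tekkotsu version differs:

    #include "Motion/PostureEngine.h"
    #include "Motion/MMAccessor.h"
    #include "Motion/DynamicMotionSequence.h"

    virtual void doStart() {
      const unsigned int numSteps = 20;       // placeholder: number of interpolation steps
      const unsigned int msecPerStep = 250;   // placeholder: time between keyframes
      PostureEngine pe;                       // scratch engine for the IK solutions
      MMAccessor<DynamicMotionSequence> ms(getMC_ID());
      for (unsigned int i = 1; i <= numSteps; ++i) {
        float s = float(i) / numSteps;
        fmat::Column<3> tgt = startPt + (endPt - startPt) * s;  // straight-line interpolation
        pe.solveLinkPosition(tgt, GripperFrameOffset, fmat::pack(0.f, 0.f, 0.f));
        ms->setTime(i * msecPerStep);   // advance to the next keyframe
        ms->setPos(pe);                 // copy the solved posture into the sequence
      }
    }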

Part V: Solving for Orientation

Do this part if you are taking the graduate version of this class (15-694).

  1. In your ~/project directory, type "make tekkotsu-HANDEYE" to build Tekkotsu for the Planar Hand/Eye system. Then run it using Mirage and play with the Arm Controller. Notice that with the current interface you can click and drag on the red dot to control the location of the end effector, but not its orientation.
  2. The PostureEngine class includes several IK solver methods. So far we've only used solveLinkPosition. A more general method is solveLink, which can solve for a position and an orientation simultaneously. Orientations are described using quaternions, a four-dimensional generalization of the complex numbers, but you don't have to worry about the math because there is a convenient built-in function for computing the quaternion for a rotation about the z-axis (see the fragment after this list). The demo program LinkOrient.cc.fsm illustrates this. Copy it to your ~/project directory, compile it, and run it. Note: turn off the Arm Controller first, or the demo won't work.
  3. Create a behavior for the Hand/Eye robot to demonstrate control of gripper orientation. Your behavior should start by tilting the camera down and looking for a small ellipse in the Mirage world. Once it knows the location of the ellipse, it should move the gripper to a position 100 mm from the ellipse, with the fingers oriented toward the center of the ellipse. If you then start up the Head Controller and move the head pan joint back and forth, the gripper should rotate around the ellipse, keeping the fingers pointed toward the ellipse center.
  4. The arm won't be able to go all the way around the ellipse, but 180° of travel will be plenty. The ability of the arm to do this will depend on the ellipse location; pick an initial location for the ellipse that permits your demo to work.
  5. Demonstrate the generality of your approach by stopping your behavior, starting the Arm Controller, and using the arm to move the ellipse to a different location. Then stop the Arm Controller and run your behavior again.
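
For reference, the quaternion helper mentioned in step 2 is used along these lines. This is a fragment only: theta is a heading in radians that you supply, and the full solveLink call pattern is what LinkOrient.cc.fsm demonstrates:

    #include "Shared/fmat.h"

    // Quaternion representing a rotation of theta radians about the z-axis:
    fmat::Quaternion q = fmat::Quaternion::aboutZ(theta);
    // Pass q, along with the desired gripper position, to PostureEngine's
    // solveLink to solve for position and orientation simultaneously.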

What to Hand In

Hand in your modified .kin file and the source code for each part. Due Friday, March 23.


Dave Touretzky and Ethan Tira-Thompson