Exploring Tekkotsu Programming on Mobile Robots:

Walking

Prev: Postures and Motion Sequences
Up: Contents
Next: World State

Contents: Walking modes, WalkMC, Ball chase example, Waypoint walk, Walk editor, Walk calibration

Walking modes

Tekkotsu currently provides two walk engines: the CMPack walk engine from Professor Manuela Veloso's 2001 robosoccer team at Carnegie Mellon, and the UPennalizers walk engine from Professor Daniel Lee's 2004 robosoccer team at the University of Pennsylvania. You can experiment with both engines using the TekkotsuMon "Walk Remote Control" interface.

Using the Walk Remote Control

Warning: For safety reasons, place the robot on the floor or in a walled playpen whenever experimenting with the walk engines. AIBOs can move quickly. Diving off the edge of a table will not be good for your robot's health.

  1. Let the robot move: take it out of "emergency stop" mode by clicking on the "Un-Stop" button in the lower right hand corner of the Controller GUI window. You should hear a short bark in response, and the status indicator in the lower left corner of the window should switch from "Stopped" (red) to "Running" (green). If you instead hear a sound like squealing tires, it means you've stopped the AIBO. In that case, hit the button again to un-stop it.

  2. From the ControllerGUI's Root Control menu, select "TekkotsuMon".

  3. From the TekkotsuMon menu, select either "Walk Remote Control" or "UPenn Walk Remote Control".

  4. Use the mouse to drive the robot around.

  5. You can switch between the CMPack and UPenn walk engines by deactivating one and then activating the other using the ControllerGUI. (If you make both active at the same time, they will interfere with each other.)

The CMPack walk engine has a smoother gait, which is helpful if you're doing vision while walking. It has also been modified by the Tekkotsu developers to use gentle acceleration and deceleration, and to implement a special stepping mode. The UPenn walk engine has a jerkier movement style, but is noticeably faster, and accelerates and decelerates rapidly.

Both walking engines use a three-component velocity vector:

  * dx: forward velocity, in mm/sec
  * dy: sideways ("strafe") velocity, in mm/sec
  * da: angular (turning) velocity, in radians/sec

By default, the Walk Remote Control uses the mouse to control dx and da. Click on the "Horizontal is strafe" button to instead control dx and dy.

The robot walks by moving its legs through a cyclic pattern with a fixed period. It changes velocity by altering its stride length, not the period of the cycle. (Animals and people alter both.) This limitation is apparent if you select an extremely slow forward speed: the robot appears to be jogging in place. Moving to a higher speed causes it to take more normal-sized steps, but the rate of foot plants doesn't change.

The modified CMPack walk also provides a "stepping" mode where you can directly specify how far the robot is to travel and how many steps it is to take. You can experiment with this using the StepTest demo behavior.

StepTest Demo

Warning: Place the robot on the floor or in a walled playpen whenever experimenting with the walk engines.

  1. Take the robot out of "emergency stop" mode by clicking on the "Un-Stop" button in the ControllerGUI.

  2. From the ControllerGUI's Root Control menu, select "Mode Switch".

  3. From the Mode Switch menu, select "StepTest".

  4. In the "Send Input:" box of the ControllerGUI, type the following:
    !msg StepTest  50  0  0  4
    

  5. The parameters to StepTest are dx, dy, da, and nsteps. The above command tells the dog to take 4 steps, each of which is 50 millimeters in the forward direction.

WalkMC motion command

A WalkMC motion command is similar to the other motion commands we have seen, such as LedMC, HeadPointerMC, PostureMC, and MotionSequenceMC. As the motion command is executing, we can use an MMAccessor to modify its parameters and change the dog's speed and direction.

The sample code below causes the AIBO to chase its pink ball. It uses a HeadPointerMC to keep the head pointed at the ball, and a WalkMC to make the dog walk in the direction the head is pointing. All the "smarts" are in processEvent, which is sent a VisionObjectEvent each time the ball is detected. Before going into the details of that, let's get all the routine housekeeping out of the way. The WalkMC-specific parts are the WalkMC.h include, the walker_id member, and the calls that create and remove the walker motion command:

#ifndef INCLUDED_BallChaser_h_
#define INCLUDED_BallChaser_h_

#include "Behaviors/BehaviorBase.h"
#include "Events/EventRouter.h"
#include "Events/VisionObjectEvent.h"
#include "Motion/HeadPointerMC.h"
#include "Motion/MMAccessor.h"
#include "Motion/MotionManager.h"
#include "Motion/WalkMC.h"
#include "Shared/ProjectInterface.h"
#include "Shared/WorldState.h"
#include "Shared/mathutils.h"

using namespace mathutils;

class BallChaser : public BehaviorBase {
 public:
  BallChaser() : BehaviorBase("BallChaser"),
    headpointer_id(MotionManager::invalid_MC_ID),
    walker_id(MotionManager::invalid_MC_ID) {}

  virtual void DoStart() {
    BehaviorBase::DoStart();
    SharedObject<HeadPointerMC> head_mc;
    headpointer_id = motman->addPersistentMotion(head_mc);
    SharedObject<WalkMC> walk_mc;
    walker_id = motman->addPersistentMotion(walk_mc);
    erouter->addListener(this,EventBase::visObjEGID,ProjectInterface::visPinkBallSID);
  }
  
  virtual void DoStop() {
    motman->removeMotion(headpointer_id);
    motman->removeMotion(walker_id);
    BehaviorBase::DoStop();
  }
  
  virtual void processEvent(const EventBase& event)
    { /* Placeholder.  See actual code below */ }

  static std::string getClassDescription()
    { return "Follows ball with head and walks wherever the head is pointing"; }
  virtual std::string getDescription() const { return getClassDescription(); }
    
  protected:
    MotionManager::MC_ID headpointer_id;
    MotionManager::MC_ID walker_id;
};

#endif

Note that the class is called BallChaser instead of DstBehavior. If you want to test out the code, you will need to make the appropriate additions to StartupBehavior_SetupModeSwitch.cc. The only thing interesting in the code above is the definition of getClassDescription. This function returns a string that will be displayed as a tooltip when you place the mouse over the BallChaser menu item in the ControllerGUI. Due to the way C++ treats static functions, we must also explicitly define getDescription here; we cannot use the version inherited from BehaviorBase.

Now we come to the interesting part. If processEvent receives a deactivate-type event, or if it receives a status event but the object detected takes up less than one percent of the image area, then the dog has lost sight of the ball and should stop moving. Otherwise, it can calculate the ball's angular position within the image, measure the current pan and tilt of the head, and then move the head so as to center the ball in the image. We define pan and tilt limits to keep from hitting the physical limits of the joints, and to prevent the head from tilting too far down, where it might hit the ball as the robot approaches. The expression min(U,max(L,x)) is a common idiom for constraining the value of x to lie within lower bound L and upper bound U.

If the ball is within about 3 degrees of straight ahead, the dog will walk forward at 160 mm/sec. Otherwise it will walk at 100 mm/sec while turning at a rate proportional to the angle by which the ball is off-center. (This angular sensitivity could be adjusted by applying a scale factor to variable new_pan.)

  virtual void processEvent(const EventBase& event) {
    const VisionObjectEvent &visev =
      *static_cast<const VisionObjectEvent*>(&event);
    float const minBallArea = 0.01f;  // one percent of the camera image

    // Stop if we lose sight of the ball.
    if ( visev.getTypeID() == EventBase::deactivateETID ||
         visev.getObjectArea() < minBallArea ) {
      MMAccessor<WalkMC>(walker_id)->setTargetVelocity(0, 0, 0);
      return;
    }

    float const ball_horiz_angle = visev.getCenterX() * CameraFOV/2;
    float const ball_vert_angle = visev.getCenterY() * CameraFOV/2;

    double const head_pan_angle = state->outputs[HeadOffset+PanOffset];
    double const head_tilt_angle = state->outputs[HeadOffset+TiltOffset];

    double const new_pan = 
      min(deg2rad(80.0), max(deg2rad(-80.0), head_pan_angle - ball_horiz_angle));
    double const new_tilt = 
      min(deg2rad(40.0), max(deg2rad(-20.0), head_tilt_angle - ball_vert_angle));

    // Move the head towards the center of the ball.
    MMAccessor<HeadPointerMC>(headpointer_id)->setJoints(new_tilt, new_pan, 0);

    // Walk in the direction the head is pointing.
    MMAccessor<WalkMC> walker_acc(walker_id);
    if ( fabs(new_pan) < 0.05 )
      walker_acc->setTargetVelocity(160, 0, 0);        // high speed straight ahead
    else
      walker_acc->setTargetVelocity(100, 0, new_pan);  // advance, turning towards ball
  }

Here are a few more details about the vision calculation; it's okay to skip this section on first reading. The functions getCenterX() and getCenterY() return normalized coordinates, which range between -1 and +1. Thus, no matter what camera resolution we're using, the center of the camera image is always at (0, 0) in normalized coordinates. On the AIBO, whose camera images are wider than they are tall, the X coordinate ranges from -1 to +1, but the normalized Y coordinate ranges from only about -0.76 to +0.76.

The constant RobotInfo::CameraFOV (actually defined in ERS7Info or ERS210Info) gives the field of view of the camera in radians. For the ERS-7 it is 56.9 degrees, or about 0.99 radians. Since the horizontal field of view ranges from -1 to +1 in camera coordinates, we have to divide CameraFOV by 2 in order to calculate the angular distance of the getCenterX() value from the middle of the image, e.g., the maximum getCenterX() value of 1 would correspond to a point about 28 degrees to the right of the midline. The vertical field of view is only 45.2 degrees; this is accommodated by the normalized Y coordinate only going up to about 0.76.

We should point out that the calculations above are just rough approximations. The value of ball_horiz_angle corresponds to the bearing of the ball relative to the dog only when the head is erect, so that the camera plane is perpendicular to the ground plane. When the head is tilted downward to some degree, "horizontal" in the camera image is not purely horizontal in the world. This rough approximation is still good enough to chase the ball because we're using closed loop control: the robot readjusts its motion with every camera frame. See the Forward Kinematics chapter of this tutorial for a discussion of how to calculate the ball position correctly.

Explore more:

  1. Modify BallChaser to use UPennWalkMC instead of WalkMC. What effect does this have on the robot's behavior?

  2. Modify BallChaser to maintain a fixed distance from the ball, based on the getObjectArea() value. If the ball gets too close (object area too large), the dog should move backwards. If it's too far (object area too small), the dog should move forwards. By rolling the ball around with your hand, you should be able to make the dog approach or retreat.

  3. Modify the original version of BallChaser to specify how many steps the dog should take. When the ball is first detected (event type activateETID), have the dog take four steps toward it, of moderate size, and then stop. It should ignore status and deactivate events. But covering the ball with your hand and then uncovering it again should generate four more steps.

  4. Write a behavior that lets you use the tail as a joystick to drive the dog. You'll need to read the tail pan and tilt values, and use them to control the da and dx parameters, respectively, of WalkMC. In order to be able to move the tail with the dog unstopped, you will have to use a PIDMC motion command to tell the AIBO to let the tail go limp. You can do that with the following code fragments:

    
    #include "Motion/PIDMC.h"

    MotionManager::MC_ID relax_id;

    relax_id = motman->addPersistentMotion(
        SharedObject<PIDMC>(TailOffset, TailOffset+NumTailJoints, 0.0));
    

Waypoint walk

Tekkotsu includes a waypoint walk engine that allows you to specify complex trajectories containing multiple segments of straight or arced paths, with independent control of heading, and options for correction of off-course deviations. For details, see the WaypointEngine documentation.

You can experiment with the waypoint walk engine and construct prototype waypoint files using the WaypointWalk Control.

Using the WaypointWalk Control

  1. From the "Root Control" menu, double click on "File Access", then select "WaypointWalkControl".

  2. Double click on "Add Egocentric Waypoint". A new menu item named "Waypoint 1" should appear.

  3. Double click on "Waypoint 1" and you will see all the parameters of the waypoint. Click on "X", type the value 0.3 in the "Send Input:" box, and hit return. This tells the dog to move forward by 300 mm. (WaypointWalk uses meters rather than mm for its measurements.)

  4. Click "Back" to go back to the WaypointWalk main menu.

  5. Double click on "Add Egocentric Waypoint" again, and then double click on "Waypoint 2".

  6. For the second waypoint, specify an X distance of 0.5 meters, and an Arc value of 1 radian.

  7. Click "Back" to go back to the WaypointWalk main menu.

  8. Click on "Save Waypoints" and save your waypoints to the file JUNK.WYP.

  9. Make sure looping (the "Loop Waypoints" checkbox in the WaypointWalk Control main menu) is turned off.

  10. Un-stop the AIBO.

  11. Double click on "Execute" and the AIBO should execute the waypoint sequence you created. If something goes wrong and you need to stop the walk, you can use the ControllerGUI's Stop button, or double tap on the AIBO's back button. You can also hit the "Back" button to exit the WaypointWalk Control, which will cancel the motion command currently executing; it will also lose all your waypoint information, but you can reload it from the file you saved.

The waypoints file is human readable:

#WyP
#add_{point|arc} {ego|off|abs} x_val y_val {hold|follow} angle_val speed_val arc_val
max_turn_speed 0.65
track_path 0
add_point EGO 0.3 0 FOLLOW 0 0.1 0
add_point EGO 0.5 0 FOLLOW 0 0.1 1
#END
File JUNK.WYP

Note: execution of the trajectory you defined is unlikely to be accurate unless the walk has been calibrated for the particular surface you're using. See the "walk calibration" section of this chapter for instructions.

Walk editor

The CMPack walk engine is controlled by about 50 parameters. These specify the timing and joint values to use for the various phases of each leg plant. By varying these parameters, you can control the AIBO's walking posture and gait. The parameters are stored in a binary file format. The default file is WALK.PRM, which is loaded automatically when a WalkMC is instantiated unless a different file is specified to the WalkMC constructor.

Loading a Walk Parameter File

  1. From the "Root Control" menu, select "File Access".

  2. From the "File Access" menu, select "Walk Edit".

  3. From the "Walk Edit" menu, select "Load Walk".

  4. Tekkotsu provides a few experimental walk parameter files in addition to the standard WALK.PRM. Select the PACE.PRM file and double-click on it to load it.

  5. From the Walk Edit menu, select "WalkControllerBehavior", and un-stop the AIBO. Drive the AIBO (at low speed) to see what the PACE walking style is like. Try both forward motion and turns.

  6. To go back to a normal walk, just select "Load Walk" and reload the file WALK.PRM.

The Walk Editor allows you to interactively vary the walk parameters and make your own walking style, which you can then save for later use.

Developing a New Walk

  1. Go to Root Control > File Access > Walk Edit, and load the file WALK.PRM.

  2. Select "Body" from the walk editor, and then set "Body Height" to 75. This is the height of the body above the ground (in mm).

  3. Use the Back button to go back to the walk editor main menu.

  4. Select "Slow Motion" and change the value from 1 to 2. This controls the speed at which the walk cycles.

  5. Use the Walk Remote Control to drive the dog around, and notice how different this gait is from the normal walk.

To learn more about the walk parameters, see this paper by Chernova and Veloso.

Walk calibration

The BallChaser program is an example of closed loop control: with each new camera frame we recompute the target velocity, so if there are any inaccuracies in the robot's walking, they will be corrected as soon as they are sensed. But sometimes you will want to use open loop control. For example, you may have calculated a path that you want the robot to execute using a WaypointWalk motion command. Or, if you're trying to position the robot relative to some object, you may want to specify a precise displacement that the robot should achieve. Unfortunately, a variety of factors prevent the AIBO from walking precisely. Systematic effects, such as consistent foot slippage on a particular walking surface, can be compensated for by calibration; unpredictable effects, such as occasional slips or collisions, cannot.

Tekkotsu provides a walk calibration tool, written in Matlab, that can be used to calculate and apply corrections to the walk parameter file used by WalkMC. A tutorial on the use of this tool is available here.



Last modified: Thu Jun 17 22:39:18 EDT 2010