15-494/694 Cognitive Robotics Lab 7:
Particle SLAM

I. Software Update and Initial Setup

  1. If you are running on your personal laptop you will need to update your copy of the cozmo-tools package. (The REL workstation copies are kept updated by course staff.) To apply the update, assuming you put the cozmo-tools package in /opt, do this:
    $ cd /opt/cozmo-tools
    $ sudo git pull
  2. For this lab you will need a robot, a charger, a Kindle, and some cubes.
  3. Log in to the workstation.
  4. Make a lab7 directory.

II. Landmark Orientation Demo

Download and run Lab7a.py. This demonstrates particle filter localization using only the robot's orientation relative to the cubes it can see. Orientation is defined as the angle between the cube's bearing to the robot and the cube's "north". This is equivalent to having a compass direction to a landmark: a bearing relative to north rather than relative to the robot's midline.
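The orientation measurement can be sketched in a few lines. This is an illustration, not the cozmo-tools implementation; the function name and argument layout are made up for the example.

```python
# Sketch of a landmark-relative orientation measurement (illustrative only,
# not the actual cozmo-tools code).
from math import atan2, degrees

def landmark_orientation(cube_x, cube_y, cube_theta, robot_x, robot_y):
    """Angle between the cube->robot bearing and the cube's own heading
    ("north"), in degrees. Note that the robot's heading does not appear:
    the measurement is like a compass bearing to a landmark."""
    bearing = atan2(robot_y - cube_y, robot_x - cube_x)  # cube->robot direction
    return degrees(bearing - cube_theta)

# A robot due "north" of a cube whose heading is 0 reads an orientation of 90:
print(landmark_orientation(0, 0, 0, 0, 100))
```

Because the robot's own heading cancels out, many different robot poses can produce the same orientation reading, which is what the experiments below explore.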

Question 1: describe a real-world situation where compass bearing would be the most useful landmark information. Hint: not indoors.

With only a single cube visible straight ahead of the robot, and with the cube's "0 degree" face facing the robot, press 'r' repeatedly to resample the particles. You should see particles distributed along a "bearing line".
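The bearing line has a simple geometric explanation: a single orientation reading constrains the robot to lie somewhere on a ray leaving the cube at a fixed angle, but says nothing about distance. A minimal sketch of that set of consistent positions (names are hypothetical):

```python
# Sketch: robot positions consistent with one orientation reading all lie
# on a ray from the cube. Illustrative only, not cozmo-tools code.
from math import cos, sin, radians

def bearing_line_points(cube_x, cube_y, cube_theta_deg, orientation_deg, dists):
    """Candidate robot positions at the given distances from the cube,
    along the ray at angle cube_theta + orientation."""
    ang = radians(cube_theta_deg + orientation_deg)
    return [(cube_x + d * cos(ang), cube_y + d * sin(ang)) for d in dists]

# With the cube at the origin, heading 0, and orientation reading 0,
# every point on the positive x axis is consistent:
print(bearing_line_points(0, 0, 0, 0, [50, 100]))
```

Repeated resampling concentrates the particle population on this ray, which is the line you should see in the viewer.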

Shift the cube left or right and observe how the bearing line changes.

Leave the cube straight ahead of the robot, but rotate it in place by 30 degrees. How does the bearing line change? Question 2: if you hold the 'r' key down long enough, the bearing line formed by the particle population slowly moves away from the landmark. What is causing this?

With two cube landmarks visible simultaneously, what result do you expect?

III. Particle SLAM with Cubes

Download and run Lab7b.py. This is a demonstration of a partial implementation of the SLAM (Simultaneous Localization and Mapping) algorithm, using cubes as landmarks. In this implementation, each particle maintains its own map with an estimate of each landmark's position. But unlike a real SLAM algorithm, this version updates the landmark positions by simple averaging rather than with an EKF (Extended Kalman Filter), so there are no covariance matrices or error ellipses. This will be fixed soon.
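The simple-averaging update described above can be sketched as a running mean maintained per particle. This is an illustration of the idea, not the actual cozmo-tools code; the data layout is an assumption.

```python
# Sketch of a simple-averaging landmark update (illustrative only).
# Each particle keeps its own landmark map; a new sighting is folded in
# as a running mean -- no covariance matrix, unlike EKF-SLAM.

def update_landmark(landmarks, cube_id, seen_x, seen_y):
    """landmarks: dict mapping cube id -> (x, y, count)."""
    if cube_id not in landmarks:
        landmarks[cube_id] = (seen_x, seen_y, 1)
    else:
        x, y, n = landmarks[cube_id]
        n += 1
        # Incremental mean: new = old + (sample - old)/n
        landmarks[cube_id] = (x + (seen_x - x) / n, y + (seen_y - y) / n, n)

m = {}
update_landmark(m, 1, 100.0, 0.0)
update_landmark(m, 1, 102.0, 2.0)
print(m[1])  # (101.0, 1.0, 2)
```

Because every sighting is weighted equally, early noisy sightings pull on the estimate forever; an EKF would instead weight each sighting by its uncertainty.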

Start out by showing the robot one cube. What happens as you drive it around?

Add a second cube to the environment. How does the map evolve as the robot drives around and looks at the two cubes?

Note: new particle_viewer commands. The lowercase wasd keys still move by 10 mm and turn by 22.5 degrees; the uppercase WASD keys now move by 25 mm and turn by 90 degrees.

Continue the experiment by adding a third cube. How close does the map come to capturing the actual landmark positions?

IV. Particle SLAM with ArUco Markers

Download and run Lab7c.py. This demo performs SLAM using ArUco markers. Press 'e' in the particle viewer to force loading of any visible landmarks after the demo starts.

Set up two pairs of ArUco markers in the pattern we used in previous labs, and see how good a map the robot can construct.

Question 3: write a short function to extract the landmark coordinates your experiment produced (just look at the first particle) and print them out. Report the results.
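One possible skeleton for such a function is below. The attribute names `particles` and `landmarks` are guesses; check the cozmo-tools source for the actual names before using this on the robot. The stand-in classes exist only so the sketch runs without a robot.

```python
# Possible skeleton for the Question 3 helper (attribute names are guesses;
# verify them against the cozmo-tools source).

def extract_landmarks(pf):
    """Return the landmark estimates stored in the first particle."""
    return dict(pf.particles[0].landmarks)

def print_landmarks(pf):
    for lm_id, coords in extract_landmarks(pf).items():
        print(lm_id, coords)

# Stand-in objects so the sketch runs without a robot:
class Particle:
    def __init__(self, landmarks):
        self.landmarks = landmarks

class Filter:
    def __init__(self, particles):
        self.particles = particles

pf = Filter([Particle({'Cube-1': (100.0, 50.0)})])
print_landmarks(pf)
```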

Hand In

Hand in your answers to Questions 1-3 and the code you wrote for Question 3.

Dave Touretzky
Last modified: Sun Feb 26 06:48:38 EST 2017