15-494/694 Cognitive Robotics Lab 9:
World Map and Speech Control

I. Software Update and Initial Setup

  1. If you are running on your personal laptop you will need to update your copy of the cozmo-tools package. (The REL workstation copies are kept updated by course staff.) To apply the update, assuming you put the cozmo-tools package in /opt, do this:
    $ cd /opt/cozmo-tools
    $ sudo git pull
  2. For this lab you will need a robot, a charger, a Kindle, and some walls.
  3. Log in to the workstation.
  4. Make a lab9 directory.
  5. Get a wall from the TAs.

II. Detecting Walls

We now have a very early implementation of a world map in which walls are inferred from ArUco markers. The particle filter continuously maintains estimates of where the markers are. When we need to do path planning, we generate a world map containing walls, which the path planner then treats as obstacles.

Walls are specified by a list of marker ids and their distances from the left edge of the wall. There are actually two sets of markers because each side of the wall has its own set. In addition, a wall has a list of doorways. Each doorway is specified by a distance value (the distance from the left edge of the wall to the middle of the doorway) and a width.
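As a concrete illustration, a wall description along these lines might look like the sketch below. This is a hypothetical layout, not the actual cozmo_fsm.world_map format: the field names and the millimeter units are assumptions, so consult the definitions in world_map.py for the real representation.

```python
# Hypothetical wall description following the scheme in the text:
# marker ids with distances from the left edge (one set per side of the
# wall), plus doorways given as (center distance, width) pairs.
wall_spec = {
    'length': 600,              # overall wall length (assumed mm)
    'markers_front': {          # marker id -> distance from left edge
        2: 100,
        3: 450,
    },
    'markers_back': {           # the other side has its own marker set
        4: 100,
        5: 450,
    },
    'doorways': [
        (300, 80),              # (distance to doorway center, width)
    ],
}
```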

The file cozmo_fsm/world_map.py contains code for specifying walls and constructing a world map. Two walls are built-in. One is a regular wall with two doorways; the other is a wall using the 0 and 1 ArUco markers from the paper sheets we used a few weeks ago, with one doorway.

Download the file Lab9a.fsm and examine it. This program plans a path from the robot's starting location to a distant goal, but first it lets the particle filter look for landmarks. Unfortunately, we have not yet integrated the particle viewer with the path viewer, so you cannot use the particle viewer to drive the robot around. However, you can move the wall relative to the robot and see how the path planner reacts.

  1. Pick a wall and measure the positions of the markers and doorways.

  2. Modify Lab9a to add information about your wall to the wall dictionary. (See the definition of make_walls in world_map.py for guidance. You can modify cozmo_fsm.world_map.wall_marker_dict from within your own program.)
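The registration pattern for step 2 might look like the sketch below. A plain dict stands in for cozmo_fsm.world_map.wall_marker_dict so the snippet is self-contained, and the spec format and the convention of keying by a marker id are guesses; copy the actual format from the built-in walls that make_walls constructs in world_map.py.

```python
# In a real program you would do:
#   import cozmo_fsm.world_map as world_map
#   world_map.wall_marker_dict[marker_id] = my_wall
# Here a plain dict stands in for that module-level dictionary.
wall_marker_dict = {}

my_wall = {
    'length': 600,                  # assumed units: mm
    'markers': {2: 100, 3: 450},    # marker id -> distance from left edge
    'doorways': [(300, 80)],        # (center distance, width)
}

# Key the wall by one of its marker ids (a guessed convention), so that
# seeing the marker lets the world map instantiate the wall:
wall_marker_dict[2] = my_wall
```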

  3. Modify the Lab9a code to set the goal location on the other side of the wall. Try different locations and see what happens.

  4. What happens if you set a goal location too close to the wall?

  5. Can you get the path planner to plan a path through a doorway instead of around the wall?

Please remind your TAs to take some pictures of you working on this part of the lab. (The folks at Anki are curious about what you're up to.)

III. Speech-Based Interaction

  1. Download the file Lab9b.fsm and have a look at it. This file is a version of the SpeechTest demo that was shown in class. It uses word and phrase dictionaries defined in cozmo_fsm/speech.py, plus Google's speech recognition service.

  2. Run the demo and verify that it works. We keep a headset in the cabinet with the robots, and two more headsets are on order.

  3. Write your own speech-based application that implements the following commands. Note: to change an object's color you will have to use a Cozmo SDK call like cozmo.objects.LightCube.set_lights(). Battery voltage is available as robot.battery_voltage.

    • Cozmo, color yourself red.
    • Cozmo, please color cube2 blue.
    • Cozmo, please color the paperclip green.
    • Cozmo, turn toward cube3.
    • Cozmo, what is your battery voltage?
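To sketch the command structure, here is a minimal, hypothetical text-to-action dispatcher for the five commands above. It is not the cozmo_fsm speech machinery (that lives in cozmo_fsm/speech.py, which you should use for the assignment); parse_command is an invented helper, and the SDK calls it would trigger (set_lights, robot.battery_voltage) appear only in comments.

```python
COLORS = {'red', 'blue', 'green'}

def parse_command(text):
    """Map a spoken command string to an (action, *args) tuple, or None."""
    # Normalize: lowercase, strip punctuation, drop filler words.
    cleaned = text.lower().replace(',', '').replace('.', '').replace('?', '')
    words = [w for w in cleaned.split() if w not in ('cozmo', 'please', 'the')]
    if words[:1] == ['color'] and len(words) == 3 and words[2] in COLORS:
        # e.g. 'color cube2 blue' -> on a real robot, light the cube:
        #   cube.set_lights(getattr(cozmo.lights, words[2] + '_light'))
        return ('color', words[1], words[2])
    if words[:2] == ['turn', 'toward'] and len(words) == 3:
        # On a real robot, turn toward the named object's map location.
        return ('turn', words[2])
    if words == ['what', 'is', 'your', 'battery', 'voltage']:
        # On a real robot, report robot.battery_voltage.
        return ('battery',)
    return None
```

Keeping parsing separate from robot actions this way makes the command grammar easy to test without a robot connected.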

Hand In

Hand in the code you wrote in parts II and III.

Dave Touretzky
Last modified: Fri Mar 24 18:16:36 EDT 2017