15-494/694 Cognitive Robotics Lab 9: Motion Detection
I. Software Update, SDK Update, and Initial Setup
Note: You can do this lab either individually or in a team of two.
At the beginning of every lab you should update your copy of the
cozmo-tools package. Do this:
$ cd ~/cozmo-tools
$ git pull
II. Motion Detection
The Cozmo SDK posts an event when motion is detected in the camera
image. In cozmo-tools this is turned into an ObservedMotionEvent
object. Run the Lab9Demo.fsm demo and wave your fingers in front of
the camera to generate some motion detection events.
Now read the documentation for the SDK's EvtRobotObservedMotion class
to understand what information is available. In cozmo-tools you can
look in event.params for the event parameters.
While the Lab9Demo behavior is running, after an event is detected,
examine robot.event.params to see the parameters. Notice that one of
the parameters is a timestamp.
III. Event Annotation
Modify cozmo_fsm/program.py to set up a handler for
EvtRobotObservedMotion events and add them to a queue. Each event
should remain in the queue for 2 seconds, based on its timestamp.
Modify the image annotator to draw a circle in the camera image for
each observed motion event in the queue, using the event's parameters
to determine the circle's location. Can you further tailor the circle
using additional event information?
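The event-handler and annotator hookup must use the cozmo-tools machinery, but the queue logic itself can be sketched on its own. Below is a minimal, hypothetical version that assumes each queued event carries a timestamp in seconds (the real values come from event.params); events older than 2 seconds are dropped whenever the queue is consulted, which is when the annotator would draw its circles.

```python
from collections import deque

MOTION_TTL = 2.0  # seconds an event remains in the queue, per the lab spec

class MotionQueue:
    """Time-bounded queue of motion events.

    Events here are plain dicts with a 'timestamp' key (seconds); in the
    real behavior they would be EvtRobotObservedMotion events, with the
    timestamp taken from event.params.
    """

    def __init__(self, ttl=MOTION_TTL):
        self.ttl = ttl
        self.events = deque()  # oldest event at the left end

    def add(self, event):
        """Called from the event handler when motion is observed."""
        self.events.append(event)

    def recent(self, now):
        """Drop expired events, then return those still current.

        The annotator would call this each frame and draw one circle
        per returned event.
        """
        while self.events and now - self.events[0]['timestamp'] > self.ttl:
            self.events.popleft()
        return list(self.events)
```

For example, an event added at t=0 still appears at t=1.0 but is gone by t=2.5. The deque keeps both operations O(1) amortized, since events expire in arrival order.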
IV. Response to Motion
If Cozmo doesn't see any cubes, he can turn to find some. But which
way should he turn? Write a behavior that uses motion events to
determine which way to turn to find a cube.
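One simple heuristic: average the image-plane x coordinates of the recent motion events and turn toward the side of the image where most motion occurred. A sketch of just that decision follows; the bare x values stand in for whatever position field event.params actually provides, and CAMERA_WIDTH assumes Cozmo's 320x240 camera image.

```python
CAMERA_WIDTH = 320  # assumed width of Cozmo's camera image in pixels

def turn_direction(motion_xs):
    """Pick a turn direction from recent motion-event x coordinates.

    motion_xs: image-plane x coordinates (pixels) of recent motion
    events. Returns 'left' if the average falls in the left half of
    the image, 'right' if in the right half, or None with no events
    (in which case the behavior might fall back to a default scan).
    """
    if not motion_xs:
        return None
    mean_x = sum(motion_xs) / len(motion_xs)
    return 'left' if mean_x < CAMERA_WIDTH / 2 else 'right'
```

In the actual behavior, the queue from Section III would supply the events, so the decision is based only on motion seen in the last 2 seconds.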
Hand In
Hand in the code you wrote above.