Notes from RoboRescue US Open 2005

This year they added two new competitions: Red Arena (advanced mobility) and Yellow Arena (full autonomy).  Thanks to Robin getting the tarantula ready and Anton adding autonomy capabilities to the Pioneer, both on short notice, we were able to compete in both competitions and won both classes.

Red arena notes:

The robots had to navigate step fields when entering the arena.  In all runs we used the tarantula, driven with its own remote control (modified with a longer antenna) and carrying the Toshiba web camera, plus the Pioneer with a Watchport camera, which did not enter the arena but watched from outside.

Run 1: Driver: Robin

The tarantula got over the first step field and found a victim.  Halfway through, NIST plugged a monitor cable into the controlling laptop to project it overhead, which messed up the interface and resolution and cost a lot of time.  Though they gave us several extra minutes, this interrupted the flow of the run.  The Pioneer stopped responding to commands and the webcam stopped updating.  The robot ran out of time on the second step field.

Run 2: Driver: Robin

The robot got over 2 step fields (the initial flat step field and a horizontal-wall step field) and found 2 victims.  The first victim was just inside the arena and the second victim was over the second step field.  There was significant “idle” time while the operator adjusted the pan and tilt of the webcam.

Run 3: Driver: Anton

The robot quickly saw the wiggling-finger victim just inside the arena.  The coupling on the front treads broke, leaving them at a 90-degree offset.  Then the rear legs broke: the right rear leg got stuck in the flat step field, and continuing to drive broke the gears.

General comments:

Perception and sensors were relatively unimportant in this competition.  Practice on the actual step fields probably would have significantly improved our performance.  I didn’t see any teams get very far into the arena (though I wasn’t able to watch most of the teams).  There were lots of split levels and stairs, ramps and step fields.  There was no unobstructed floor.  It was disappointing that the emphasis was completely on the step fields since we did not see how the various platforms performed on other types of terrain.

Yellow arena notes:

The robots had to be completely autonomous.  They could be reset, but at the cost of counting an operator.  Points were awarded only for coming close to a victim; there was no victim-identification component.  We used only the Pioneer, with mapping running (though mapping was not scored in this competition).

Run 1:

The robot traveled down the corridor and “found” a victim.  It then turned around and went back to the start where we blocked it.  It turned around and went back down the corridor and “saw” the victim again.  It turned 90 degrees and crashed into the wall.  We are unsure of what caused this malfunction but believe that cords may have interfered with the laser range finder.  We reset the robot and started it again. It went down the hall past the first victim and started down the curved section where it “found” a second victim.  It interpreted the curved corridor as a dead end and turned back to the main corridor.  Upon reaching the main corridor, it “saw” the first victim, and turned around back to the curved corridor.  In the curved corridor, it ran out of time.

Run 2:

After an initial false start, in which the laser powered off and the robot ran straight down the arena and smacked into the far wall (which NIST was kind enough not to count as a reset), the robot went down a straight hallway, ignoring 2 alcoves with victims.  It turned around and went through a diagonal corridor that was a very tight fit.  It reached an open area and “found” a victim.  It then proceeded to loop around in a figure 8 for over 5 minutes.

Run 3:

The robot started at the same location as the first run, drove down the corridor, through the curved corridor to another hallway.  It turned left down the hallway, drove to the end and turned around.  It then turned into one of the alcoves and “found” a victim and then reentered the main hallway where it paced back and forth until the end of mission.

Comments:

The Pioneer performed very well in autonomous mode, with the exception of a few times when it ran into the wall.  We are not using the gyroscope on the Pioneer (I don’t think).  Mike put duct tape on the wheels, which enabled the robot to handle carpet.  Anton implemented the autonomous mode using actions from the Pioneer software, and he experimented with different parameters across the runs.
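
The Pioneer's control software composes behavior from prioritized "actions" (e.g. avoid an obstacle, drive forward), with a resolver deciding which action's motion request wins each cycle.  The sketch below shows that general scheme in Python; the class names, priorities, and thresholds are illustrative only, not the actual Pioneer API or Anton's parameters:

```python
# Minimal sketch of priority-based action arbitration, in the style of the
# Pioneer's action framework.  All names and thresholds are illustrative.

class Action:
    def fire(self, sensors):
        """Return a (speed_mm_s, turn_deg_s) request, or None to abstain."""
        raise NotImplementedError

class AvoidFront(Action):
    """Stop and turn away when something is close in front."""
    def __init__(self, stop_dist_mm=450, turn_deg_s=30):
        self.stop_dist_mm = stop_dist_mm
        self.turn_deg_s = turn_deg_s
    def fire(self, sensors):
        if sensors["front_range_mm"] < self.stop_dist_mm:
            return (0, self.turn_deg_s)   # request: stop and rotate
        return None                       # abstain; let lower priorities act

class ConstantVelocity(Action):
    """Default behavior: drive straight ahead."""
    def __init__(self, speed_mm_s=300):
        self.speed_mm_s = speed_mm_s
    def fire(self, sensors):
        return (self.speed_mm_s, 0)

def resolve(prioritized_actions, sensors):
    """Highest-priority action that does not abstain wins this cycle."""
    for _, action in sorted(prioritized_actions, key=lambda p: -p[0]):
        request = action.fire(sensors)
        if request is not None:
            return request
    return (0, 0)                         # nothing fired: stay put

actions = [(50, AvoidFront()), (25, ConstantVelocity())]
print(resolve(actions, {"front_range_mm": 2000}))  # clear path: (300, 0)
print(resolve(actions, {"front_range_mm": 300}))   # obstacle: (0, 30)
```

Tuning in this scheme means adjusting the per-action parameters (stop distance, cruise speed, turn rate) rather than rewriting the control loop, which matches the kind of per-run parameter experimentation described above.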

Day 2: Orange Arena

All teams had 10 minutes to run this year (compared to 20 minutes last year).   The arena consisted of definite paths with open floor (carpet or tarp), step fields and stairs (red arena), and autonomy-only sections (yellow arena).  In contrast to last year, lighting was not a problem: flashlights were not necessary, and even the PER cameras functioned well.

Run 1: Driver: Anton

We started with 2 PERs (Speedy and Pokey) and the Pioneer.  Speedy had the web camera mounted above its pan-tilt head for sound and an additional point of view.  Pokey had a Watchport camera with streaming video.  We were concerned that the Pioneer would not fit under a bridge at the starting line and so opted to do the autonomy section first.  Unfortunately, we had problems with the laser range finder, which turned off; the robot ran straight down the corridor, crashed into the far wall, and was disabled.  The two PERs ventured a bit into the arena, but their video was unresponsive and the operator was unable to control them using command mode.  No victims were found.

Run 2: Driver: Anton

We started with 2 PERs (Speedy and Pokey) and the Pioneer (same configuration of hardware as in run 1).  We started at a different location in the arena.  We were still having difficulties with the laser range finder and so opted to control the Pioneer through Mobile Eyes (software that came with the Pioneer) and use streaming video from the Watchport and IR cameras.  The Pioneer found 2 victims before the laptop controlling it powered down (about 5 minutes into the competition).  The operator proceeded to direct the other robots into the arena without finding any more victims.

Run 3: Driver: Anton

We started with 1 PER (Speedy), the tarantula, and the Pioneer.  Speedy had a Watchport camera for streaming video.  The tarantula had the webcam mounted on its rear and was driven backwards, since its front legs were broken.  The Pioneer was outfitted as in run 2, controlled via Mobile Eyes with streaming IR and video from the Watchport.  The tarantula went into the arena first and quickly found a victim.  The PER followed to help guide the tarantula with an external point of view.  The Pioneer detected heat and sound from an invisible victim just inside the arena, but it did not count since the victim was on a raised step and the robot was supposed to “attain the level” of the victim.  The Pioneer tried to go under the bridge, but the Watchport camera was too high and the robot got stuck and was disabled.  The PER and tarantula explored the arena together, and with 40 seconds to go, the tarantula drove around and found a second victim.

Finals:

There were 6 teams entered in the search and rescue league.  Based on results from the first day, 4 teams qualified for the finals.  RAPTOR led after the first day with 11 points, followed by NIIT Blue with 7, and Red Knight and the University of Manitoba with 2-3 points each.  There were two rounds in the finals, and each run lasted 15 minutes.

Run 1: Driver: Anton

We used the Pioneer, a PER (Speedy), and a mostly working tarantula we borrowed from NIST.  The tarantula had the Toshiba webcam mounted on it.  The PER had a Watchport.  The robots were started facing away from the arena, looking at a wall, and the space was very constricted.  We had significant difficulties getting the Pioneer started, since the USB-to-serial converter that controlled the robot was disconnected when we moved the robot to the arena.  In the process of fixing that, we accidentally hit the power switch on the laser range finder.  We finally debugged that and hurriedly started the robots.  Unfortunately, we forgot to turn on the tarantula.  The Pioneer drove into the arena, but the laser range finder powered off, so it could not get range data for feedback or build a map.  There was little open space, and the robot got stuck on a step field.  The driver attempted to bring the PER in to provide an extra point of view, but it got tangled in the tarantula’s treads.  We turned on the tarantula, and the driver eventually moved it out of the PER’s way.  The PER hit either an obstacle or the tarantula and tipped over onto its back.  The tarantula was unable to get through the corridor since the PER was in the way.  The driver called end of mission without finding any victims.

Run 2: Driver: Anton

We used the same configuration of robots as in run 1.  We had fixed the front left steering servo on the PER, but apparently incorrectly, because it sat at a 90-degree angle when it should have pointed straight.  We sent the Pioneer in first.  The Pioneer found 2 victims (one an IR signature and one a waving arm) while being expertly driven and logging data to build a map of the environment later.  The driver then sent in the tarantula to attempt a set of stairs, but the tarantula proved too damaged to complete the task.  The driver sent the Pioneer through the autonomous section, where it passed a victim that it had already identified.  It came out the other side with a bit of difficulty and continued to explore.  It nearly found another victim before the end of mission was called.

Lessons learned:

  • Systems engineering is important
  • Practice is important
  • Autonomy: should add human detection code
  • Robustness is important