Lunar Rover Navigation 1997

Nomad

The Robotics Institute at Carnegie Mellon University is developing robotic technologies for planetary exploration as part of its Lunar Rover Initiative. Robotic navigational autonomy is a critical capability for planetary exploration, because the time delay in communication can be many minutes. Navigational autonomy allows a robot to make progress even in the absence of communication with Earth.

See the Navigation '97 QuickTime video (11 Megabytes).

In 1997, as part of the Atacama Desert Trek field demonstration, the navigation system onboard the Nomad robot autonomously drove 21 kilometers through the Atacama Desert in northern Chile. The driest desert in the world, the Atacama contains thousands of square kilometers of rocky, barren, and varied terrain, providing the best terrestrial analog of Moon- and Mars-like environments.

Our robot navigation system consisted of three primary components: position estimation, obstacle detection, and path planning.

Position Estimation
During the 1997 Atacama Desert Trek, position estimation was provided by differential GPS, gyrocompass, and inclinometer sensors.
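
As a rough illustration of how these readings combine into a single pose, the sketch below (in Python) projects a DGPS latitude/longitude fix into local metric coordinates and pairs it with the gyrocompass heading and inclinometer attitude. The projection, field names, and structure are assumptions made for this example, not a description of Nomad's onboard software.

    # Illustrative sketch: DGPS fix + gyrocompass heading + inclinometer attitude
    # combined into one pose. All names and the simple projection are assumptions.
    import math
    from dataclasses import dataclass

    EARTH_RADIUS_M = 6_371_000.0

    @dataclass
    class Pose:
        x: float        # meters east of the local origin
        y: float        # meters north of the local origin
        heading: float  # radians from true north (gyrocompass)
        roll: float     # radians (inclinometer)
        pitch: float    # radians (inclinometer)

    def latlon_to_local(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
        """Small-area equirectangular projection; adequate over a few kilometers."""
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        lat0, lon0 = math.radians(origin_lat_deg), math.radians(origin_lon_deg)
        x = EARTH_RADIUS_M * (lon - lon0) * math.cos(lat0)
        y = EARTH_RADIUS_M * (lat - lat0)
        return x, y

    def make_pose(dgps_fix, gyro_heading, roll, pitch, origin):
        """Assemble the pose used to register new elevation maps with old ones."""
        x, y = latlon_to_local(dgps_fix[0], dgps_fix[1], origin[0], origin[1])
        return Pose(x, y, gyro_heading, roll, pitch)
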
Obstacle Detection
Obstacle detection was provided by two pairs of rigidly mounted stereo cameras and the navigation computer. Nomad's stereo vision system had to have its field of view expanded, because Nomad was almost two and a half times the width of Ratler, the prototype on which the stereo vision system was first demonstrated. This was accomplished using wide-angle lenses and a new calibration procedure, for which we developed a precisely machined, 6-foot-tall calibration cube. These tools resulted in a highly successful deployment. The stereo vision sensors maintained their calibration, ran successfully for up to nine hours each day, and were effectively untouched during the 50 days of field operations.
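
The range data used downstream comes from standard stereo geometry: depth is the focal length times the camera baseline divided by the disparity between matched pixels. The sketch below shows only that relationship; the parameter values are placeholders, not Nomad's calibrated camera parameters.

    # Illustrative sketch of the stereo geometry behind range images:
    # depth = focal_length * baseline / disparity. Values are placeholders.
    import numpy as np

    def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
        """Convert a disparity image (pixels) to a depth image (meters)."""
        disparity = np.asarray(disparity_px, dtype=float)
        depth = np.full_like(disparity, np.inf)
        valid = disparity > 0                     # zero disparity = no stereo match
        depth[valid] = focal_length_px * baseline_m / disparity[valid]
        return depth
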

[Safeguarded Navigation illustration]

Path Planning
In the context of the complete navigation system, a pair of images taken by the stereo cameras is processed into range data in less than one second. This range data is reprojected into an overhead view, yielding an elevation map of the area in front of the robot; this is combined with previous elevation maps using the position estimate. The new elevation map is then filtered to find all steep slopes, obstacles, dropoffs, and unknown areas. The planner considers all possible curved paths through the map as far as 10 meters ahead, and marks each one as safe or not-safe.
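
The arc-evaluation step can be pictured as tracing each candidate curvature through the filtered map and rejecting any arc that crosses an unsafe or unknown cell. The sketch below follows that idea; the grid layout, resolution, and candidate curvatures are assumptions made for illustration, not the actual implementation of the Morphin planner.

    # Illustrative sketch of arc-based path evaluation over a grid of cell safety
    # flags. Grid conventions and parameters are assumptions for this example.
    import math

    def arc_is_safe(safe_grid, curvature, lookahead_m=10.0, step_m=0.25,
                    cell_size_m=0.5):
        """Trace one curved path from the robot origin and check every cell.

        safe_grid[iy][ix] is True if the cell is traversable; the robot sits at
        (0, 0) facing +y, and x is lateral offset in meters.
        """
        s = 0.0
        while s <= lookahead_m:
            if abs(curvature) < 1e-6:            # straight-ahead special case
                x, y = 0.0, s
            else:
                r = 1.0 / curvature              # signed turning radius
                x = r * (1.0 - math.cos(s / abs(r)))
                y = abs(r) * math.sin(s / abs(r))
            ix = int(round(x / cell_size_m)) + len(safe_grid[0]) // 2
            iy = int(round(y / cell_size_m))
            if not (0 <= iy < len(safe_grid) and 0 <= ix < len(safe_grid[0])):
                return False                     # unknown terrain counts as unsafe
            if not safe_grid[iy][ix]:
                return False
            s += step_m
        return True

    def evaluate_arcs(safe_grid, curvatures):
        """Return {curvature: safe?} for every candidate steering arc."""
        return {c: arc_is_safe(safe_grid, c) for c in curvatures}
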

Armed with this knowledge of path safety, Nomad controlled its direction of travel in three ways, listed below (a simplified sketch of how each mode selects among the safe arcs follows the list). In every mode Nomad found obstacles on its own and chose only safe paths; however, the software model of the novel steering mechanism was incomplete, so a human operator occasionally had to back Nomad away from an obstacle.

  1. Using Obstacle Avoidance, it drove randomly but safely. At one point in the desert, Nomad came across an access road and chose it as the safest route, following the road for nearly 400 meters.
  2. In Safeguarded Teleoperation, a remote driver provided a preferred steering direction at each time step; Nomad still chose only safe directions, but preferred those near the user's direction.
  3. The final, most-used method was Waypoint Navigation. In this mode the user specified a point in the world that could be meters or kilometers away. Nomad drove itself toward the specified point, avoiding obstacles along the way. During the summer of 1997 this world point was entered by typing coordinates on a command line, but future versions of the user interface will allow the user to click on an overhead map view.
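
One simplified way to picture the three modes is that they share a single selection rule over the planner's safe arcs: obstacle avoidance accepts any safe arc, while safeguarded teleoperation and waypoint navigation prefer the safe arc closest to a commanded or goal-derived direction. The sketch below is illustrative only; the mode names match the text, but the selection and steering rules are assumptions, not Nomad's actual arbitration code.

    # Illustrative sketch of choosing a steering arc in each of the three modes.
    # The selection rules and helper functions are simplified assumptions.
    import math
    import random

    def pick_arc(safe_arcs, mode, preferred_curvature=None):
        """safe_arcs: list of curvatures the planner marked safe."""
        if not safe_arcs:
            return None                          # stop and wait for the operator
        if mode == "obstacle_avoidance":
            return random.choice(safe_arcs)      # any safe direction will do
        # Safeguarded teleoperation and waypoint navigation both reduce to
        # "the safe arc closest to a preferred one": the operator's commanded
        # direction in one case, the arc toward the goal point in the other.
        return min(safe_arcs, key=lambda c: abs(c - preferred_curvature))

    def curvature_toward(robot_xy, robot_heading, goal_xy, gain=0.5):
        """Map heading error toward a waypoint to a preferred curvature
        (a simple proportional rule, purely for illustration)."""
        bearing = math.atan2(goal_xy[0] - robot_xy[0], goal_xy[1] - robot_xy[1])
        error = (bearing - robot_heading + math.pi) % (2 * math.pi) - math.pi
        return gain * error
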

The autonomy system was on call throughout the mission and was typically run on demand as circumstances warranted, such as during lunch breaks or losses of communication.

[Graph: daily total distance traversed in autonomous mode]

In summary, Nomad drove a total of 21 kilometers autonomously and 7 kilometers under safeguarded, user-driven teleoperation.

As NASA plans its future exploration missions, this capability of tens of kilometers of autonomous traverse will become critical. In the near future, Carnegie Mellon will enhance the navigation system by adapting sensors to extremely cold environments, and will send Nomad and future robots to search for meteorites in Antarctica.

RESEARCHERS
Stereo processing code: Martial Hebert
Software networking support: Juan León
Linux system administration: Paul Levin
Navigation Lead: Mark Maimone
Camera enclosures and mounting: Eric Rollins
Morphin navigation planner: Reid Simmons
Electrical design: James Teza
Calibration cube design and manufacture: Centre for Metahuman Exploration, Carnegie Mellon
User interface: NASA Ames Intelligent Mechanisms Group

Related Links:

The Robotics Institute
Sandia National Labs
LunaCorp
NASA Telerobotics
1997 Atacama Desert Trek Nomad page at NASA Ames
1996 Lunar Rover Navigation

mwm@cs.cmu.edu