Augmented Reality


Introduction

The Augmented Reality for Supervisory Rover Control task has developed technology that facilitates the control of rovers operating in distant environments. The objective is to create an interface that can operate on a stream of images coming from the rover and use maps to help the operator locate the rover.

The approach augments operator displays by overlaying raw rover-acquired imagery with additional information such as topographic landmarks, heading and position estimates, and planned routes. We will overlay this information in such a manner that it appears to "stick" to the terrain, independent of the rover's motion. The first FY97 project goal is to demonstrate position estimation from skyline matches in a real environment.
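To make the idea of terrain-registered overlays concrete, the sketch below (a minimal illustration, not the task's actual software) re-projects a fixed 3-D terrain landmark into each frame using the rover's current camera pose; because the landmark is defined in world coordinates, its label follows the terrain rather than the camera. The intrinsics, pose, and landmark values are purely illustrative.

```python
import numpy as np

def project_landmark(point_world, R, t, K):
    """Project a 3-D terrain landmark (world frame) into the image.

    R, t: rotation and translation taking world coordinates into the
    camera frame (from the rover's pose estimate); K: 3x3 intrinsics.
    Returns pixel coordinates (u, v), or None if the point is behind
    the camera.
    """
    p_cam = R @ point_world + t
    if p_cam[2] <= 0:               # behind the image plane
        return None
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Example: re-projecting the same world-frame landmark every frame
# makes the overlay follow the terrain, not the (moving) camera.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                            # camera pose for this frame
t = np.zeros(3)
hill_summit = np.array([12.0, -3.0, 40.0])   # landmark, world coordinates
print(project_landmark(hill_summit, R, t, K))
```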

The current deliverables will be tested during the Atacama Desert Trek, tentatively in the period of July 10 to July 20. The imaging equipment and image processing algorithms developed here will be taken to the field and tested both while attached to the Nomad rover and during trips to areas of high topographic relief. Results will be available at the end of the Atacama Desert Trek.

A website is maintained with the data (images and maps) produced by the system.


A. Significant Activities and Events During this Period

Since January 1997, the thrust of the project has been to improve the basic tools available in a variety of ways, so that the system can be tested and validated in the upcoming Atacama Desert traverse.

We obtained elevation data for the Atacama desert, where the upcoming field trial will be conducted (with skyline tests during the period of July 10 to July 20). The maps were converted into standard USGS format and can be manipulated and visualized in the map window. We plan to use the system to support map/image visualization and control in the Atacama field trial.
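For illustration only, and assuming the elevation data ends up as a simple ASCII header-plus-grid file (the actual USGS DEM format is more involved), the map window's input step could look like the sketch below; the reader and the crude shaded-relief function are hypothetical stand-ins.

```python
import numpy as np

def read_ascii_grid(path):
    """Read a simple ASCII elevation grid: six header lines (ncols,
    nrows, corner coordinates, cellsize, nodata value) followed by
    row-major elevation values.  A stand-in for the real DEM reader."""
    header = {}
    with open(path) as f:
        for _ in range(6):
            key, value = f.readline().split()
            header[key.lower()] = float(value)
        elev = np.loadtxt(f)
    return header, elev

def shaded_relief(elev, cellsize):
    """Crude shaded relief for display: bright where the terrain is
    flat, dark where it is steep (gradient magnitude as a proxy)."""
    dzdy, dzdx = np.gradient(elev, cellsize)
    slope = np.hypot(dzdx, dzdy)
    return 1.0 / (1.0 + slope)
```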

The image below is a mosaic constructed from a sequence of images of the Atacama desert area. The skyline is detected by a region-growing algorithm that starts at the top of the image and grows the sky region downward until pixels with high derivatives are encountered.
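A minimal sketch of this region-growing idea is given below; it is an illustration rather than the field code, and the neighbourhood, gradient estimate, and threshold are all assumptions.

```python
import numpy as np
from collections import deque

def detect_skyline(gray, grad_thresh=25.0):
    """Grow a 'sky' region from the top row of the image and stop at
    pixels with a high intensity gradient; the boundary is the skyline.
    Returns, for each column, the row index of the first non-sky pixel
    (columns that are entirely sky map to row 0)."""
    rows, cols = gray.shape
    gy, gx = np.gradient(gray.astype(float))
    grad = np.hypot(gx, gy)

    sky = np.zeros((rows, cols), dtype=bool)
    seeds = [(0, c) for c in range(cols) if grad[0, c] < grad_thresh]
    for r, c in seeds:
        sky[r, c] = True
    queue = deque(seeds)

    while queue:                         # breadth-first region growing
        r, c = queue.popleft()
        for rr, cc in ((r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= rr < rows and 0 <= cc < cols
                    and not sky[rr, cc] and grad[rr, cc] < grad_thresh):
                sky[rr, cc] = True
                queue.append((rr, cc))

    return np.argmin(sky, axis=0)        # first non-sky row per column

# Example: a synthetic frame with uniform sky over textured ground;
# the detected rows should sit near the 60-pixel sky/ground boundary.
img = np.vstack([np.full((60, 100), 200.0),
                 80.0 + 60.0 * np.random.rand(40, 100)])
print(detect_skyline(img)[:10])
```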

We planned to integrate a new method for position estimation during this quarter, but we had to concentrate more effort on the calibration of cameras and optical equipment. The algorithms are now being calibrated and should be tested in time for the upcoming Atacama trial.

B. Plans for the Next Reporting Period

The most important task to be completed is to incorporate and calibrate the algorithm for position estimation using the whole skyline. This task should be completed by the end of May, when we plan to conduct tests with the whole system and prepare it for shipment to the Atacama desert.
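One common way to estimate position from the whole skyline, shown here only as an illustrative sketch and not necessarily the algorithm used in this task, is to render the horizon elevation-angle profile that the elevation map predicts at each candidate location and pick the candidate whose prediction best matches the observation. The functions, parameters, and error metric below are assumptions; the observed skyline is taken to be already expressed as an elevation angle per azimuth, which requires the camera calibration mentioned above.

```python
import numpy as np

def rendered_skyline(dem, cellsize, x, y, observer_height=1.5,
                     n_azimuths=360, max_range=200):
    """Predict the skyline (horizon elevation angle per azimuth) seen
    from DEM cell (x, y), by walking outward along each azimuth and
    keeping the maximum elevation angle.  A coarse sketch: no
    interpolation and no Earth-curvature correction."""
    z0 = dem[y, x] + observer_height
    angles = np.full(n_azimuths, -np.pi / 2)
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_azimuths,
                                       endpoint=False)):
        dx, dy = np.cos(az), np.sin(az)
        for step in range(1, max_range):
            xx = int(round(x + step * dx))
            yy = int(round(y + step * dy))
            if not (0 <= yy < dem.shape[0] and 0 <= xx < dem.shape[1]):
                break
            ang = np.arctan2(dem[yy, xx] - z0, step * cellsize)
            angles[i] = max(angles[i], ang)
    return angles

def estimate_position(observed, dem, cellsize, candidates):
    """Pick the candidate (x, y) whose rendered skyline best matches
    the observed profile (sum of squared angle differences)."""
    errors = [np.sum((rendered_skyline(dem, cellsize, x, y) - observed) ** 2)
              for x, y in candidates]
    return candidates[int(np.argmin(errors))]
```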

Preliminary tests have been conducted with image stabilization and with overlaying information on the incoming video stream (information is already overlaid on top of the mosaics). We plan to finish implementing these algorithms and test them with video sequences.
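As an illustration of the stabilization step, and only as a sketch under the assumption that the dominant frame-to-frame motion is a translation (this is not the task's implementation), phase correlation between consecutive frames yields a pixel offset that can be used to shift overlay graphics so they stay attached to the scene:

```python
import numpy as np

def estimate_shift(prev, curr):
    """Estimate the (dy, dx) translation of `curr` relative to `prev`
    by phase correlation.  Whole-pixel accuracy only."""
    F_prev = np.fft.fft2(prev.astype(float))
    F_curr = np.fft.fft2(curr.astype(float))
    cross = F_curr * np.conj(F_prev)
    cross /= np.abs(cross) + 1e-9        # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the frame into negative values
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return dy, dx

# To keep an annotation "attached" to a scene feature, accumulate the
# per-frame shifts and move the annotation's pixel position by the
# accumulated offset each time a new frame arrives.
```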

As before, we will continue discussions with the JPL Long-Range Science Rover task to support their position estimation needs, and will investigate the extent to which the activities in this task can support the needs of the meteorite search task. Conversations with the Atacama Desert Trek team have already produced a commitment to test these technologies during that mission.


C. Schedule


D. Concerns/Issues

None.


Prepared by fgcozman@cs.cmu.edu