Master of Science in Robotics Thesis Talk

  • Remote Access - Zoom
  • Virtual Presentation - ET

Underground Representations for Robot Localization and Mapping

There has been exciting recent progress in using radar as a sensor for robot navigation, given its robustness to varying environmental conditions. Within these radar perception systems, however, ground penetrating radar (GPR) remains under-explored. By measuring structures beneath the ground, GPR can provide stable features that are largely invariant to ambient weather, scene, and lighting changes, making it a compelling choice for long-term spatio-temporal mapping.

In this work, we present a set of approaches for robots to naturally reason about subsurface information for robust localization and mapping in unknown environments. First, we propose a novel method for place recognition using GPR measurements for robot localization. We achieve this by horizontally stacking one-dimensional GPR measurements into two-dimensional images, followed by a spatial correlation network over learned image features to correct for translational drift. We find that this approach improves GPR-based localization performance compared to engineered heuristics because it learns distinctive features from often noisy and repetitive images.
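The core preprocessing step, stacking one-dimensional GPR traces into a two-dimensional image and correlating two such images to recover an along-track offset, can be sketched as below. This is an illustrative simplification, not the thesis's learned pipeline: the actual method applies correlation over learned network features, whereas here a plain normalized cross-correlation over raw column statistics stands in for it, and all function names are hypothetical.

```python
import numpy as np

def stack_traces(traces):
    """Stack 1D GPR traces (each a vector of depth samples) column-wise
    into a 2D image of shape (depth_bins, n_traces)."""
    return np.stack(traces, axis=1)

def estimate_shift(query, reference):
    """Estimate the horizontal (along-track) offset, in trace indices,
    between two GPR images via cross-correlation of their mean-removed
    column profiles. Positive result: query is shifted right."""
    q = query.mean(axis=0) - query.mean()
    r = reference.mean(axis=0) - reference.mean()
    corr = np.correlate(q, r, mode="full")
    return int(np.argmax(corr)) - (len(r) - 1)

# Synthetic example: a reflector at column 5 vs. the same reflector at 9.
ref = np.zeros((8, 20)); ref[:, 5] = 1.0
qry = np.zeros((8, 20)); qry[:, 9] = 1.0
shift = estimate_shift(qry, ref)
```

In the learned version, the raw column profiles would be replaced by feature maps from a trained encoder, which is what allows distinctive structure to emerge from noisy, repetitive radargrams.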

Second, we propose a GPR-based simultaneous localization and mapping (SLAM) method that does not require re-visitation by explicitly modeling the relationships among linear features, such as pipes and geologic fractures, commonly found in real-world scenes. We formulate this as an inference over a factor graph to jointly estimate latent Hough line features and robot states. We find that this approach effectively reduces drift perpendicular to line observations and allows us to simultaneously reconstruct the underground environment features and estimate the robot state.
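The intuition behind jointly estimating robot states and line features in one factor graph can be illustrated with a minimal linear-Gaussian analogue. This is a deliberately simplified sketch under stated assumptions: a 1D robot, a single line parameterized only by its offset, and a direct linear least-squares solve. The thesis's actual formulation uses Hough line parameterizations and nonlinear factor graph inference; all names here are hypothetical.

```python
import numpy as np

def solve_graph(odom, ranges, sigma_o=0.1, sigma_r=0.05):
    """Jointly estimate 1D robot positions x_0..x_T and the offset c of a
    subsurface line from odometry factors (x_{t+1} - x_t = odom[t]) and
    perpendicular-range factors (c - x_t = ranges[t]), as one weighted
    linear least-squares problem. Returns (poses, line_offset)."""
    T = len(odom)
    n = T + 2                     # unknowns: x_0..x_T and c (last entry)
    A, b = [], []
    # Prior factor anchoring x_0 at the origin (removes gauge freedom).
    row = np.zeros(n); row[0] = 1.0
    A.append(row); b.append(0.0)
    # Odometry factors, weighted by 1/sigma_o.
    for t, u in enumerate(odom):
        row = np.zeros(n); row[t + 1] = 1.0; row[t] = -1.0
        A.append(row / sigma_o); b.append(u / sigma_o)
    # Line-observation factors, weighted by 1/sigma_r; these constrain
    # the component of drift perpendicular to the line.
    for t, r in enumerate(ranges):
        row = np.zeros(n); row[-1] = 1.0; row[t] = -1.0
        A.append(row / sigma_r); b.append(r / sigma_r)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol[:-1], sol[-1]
```

Because each range factor ties a pose directly to the shared line variable, error cannot accumulate freely in that direction, which mirrors how the full method reduces drift perpendicular to line observations while simultaneously reconstructing the feature itself.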

Thesis Committee:
Michael Kaess (Advisor)
Dimitrios Apostolopoulos
Paloma Sodhi

