Building 3D Mosaics from an Autonomous Underwater Vehicle and 2D Imaging Sonar

Download: PDF.

“Building 3D Mosaics from an Autonomous Underwater Vehicle and 2D Imaging Sonar” by P. Ozog, G. Troni, M. Kaess, R.M. Eustice, and M. Johnson-Roberson. In Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA, (Seattle, WA), May 2015, pp. 1137-1143.

Abstract

This paper reports on a 3D photomosaicing pipeline using data collected from an autonomous underwater vehicle performing simultaneous localization and mapping (SLAM). The pipeline projects and blends 2D imaging sonar data onto a large-scale 3D mesh that is either given a priori or derived from SLAM. Compared to other methods that generate a 2D-only mosaic, our approach produces 3D models that are more structurally representative of the environment being surveyed. Additionally, our system leverages recent work in underwater SLAM using sparse point clouds derived from Doppler velocity log range returns to relax the need for a prior model. We show that the method produces reasonably accurate surface reconstruction and blending consistency, with and without the use of a prior mesh. We experimentally evaluate our approach with a Hovering Autonomous Underwater Vehicle (HAUV) performing inspection of a large underwater ship hull.
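
The projection-and-blending idea can be sketched in a few lines. The Python snippet below is only an illustrative simplification, not the authors' implementation: it assumes each (range, bearing) sonar return is back-projected at zero elevation angle, that the vehicle pose is given as a rotation R and translation t (e.g. a SLAM estimate), and that blending is approximated by a running average on the nearest mesh vertex; all function names are hypothetical.

import numpy as np

def sonar_to_vehicle(ranges, bearings):
    # Back-project polar sonar returns (range, bearing) into the vehicle
    # frame, collapsing the sonar's elevation ambiguity onto zero elevation.
    x = ranges * np.cos(bearings)
    y = ranges * np.sin(bearings)
    return np.stack([x, y, np.zeros_like(ranges)], axis=1)

def to_world(points, R, t):
    # Map vehicle-frame points into the world frame of the mesh using a
    # rigid-body pose (R, t).
    return points @ R.T + t

def blend_onto_mesh(mesh_vertices, points, intensities, accum, weight):
    # Accumulate sonar intensities onto the nearest mesh vertex with a
    # running average, a crude stand-in for the paper's blending step.
    for p, val in zip(points, intensities):
        idx = np.argmin(np.linalg.norm(mesh_vertices - p, axis=1))
        accum[idx] = (accum[idx] * weight[idx] + val) / (weight[idx] + 1.0)
        weight[idx] += 1.0
    return accum, weight

# Toy usage: one sonar ping projected onto a small flat mesh patch.
mesh_vertices = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
accum = np.zeros(len(mesh_vertices))
weight = np.zeros(len(mesh_vertices))

ranges = np.array([1.0, 1.5, 2.0])
bearings = np.deg2rad([-10.0, 0.0, 10.0])
intensities = np.array([0.8, 0.6, 0.9])

R, t = np.eye(3), np.zeros(3)  # identity pose for illustration
pts = to_world(sonar_to_vehicle(ranges, bearings), R, t)
accum, weight = blend_onto_mesh(mesh_vertices, pts, intensities, accum, weight)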

BibTeX entry:

@inproceedings{Ozog15icra,
   author = {P. Ozog and G. Troni and M. Kaess and R.M. Eustice and M. Johnson-Roberson},
   title = {Building {3D} Mosaics from an Autonomous Underwater Vehicle and {2D} Imaging Sonar},
   booktitle = {Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA},
   pages = {1137--1143},
   address = {Seattle, WA},
   month = may,
   year = {2015}
}