Real-time Large Scale Dense RGB-D SLAM with Volumetric Fusion

Download: PDF.

“Real-time Large Scale Dense RGB-D SLAM with Volumetric Fusion” by T. Whelan, M. Kaess, H. Johannsson, M.F. Fallon, J.J. Leonard, and J.B. McDonald. Intl. J. of Robotics Research, IJRR, vol. 34, no. 4-5, Apr. 2015, pp. 598-626.


We present a new simultaneous localization and mapping (SLAM) system capable of producing high-quality, globally consistent surface reconstructions over hundreds of meters in real time with only a low-cost commodity RGB-D sensor. By using a fused volumetric surface reconstruction we achieve a much higher quality map than would be achieved using raw RGB-D point clouds. In this paper we highlight three key techniques associated with applying a volumetric fusion-based mapping system to the SLAM problem in real time. First, we use a GPU-based 3D cyclical buffer to efficiently extend dense every-frame volumetric fusion of depth maps to function over an unbounded spatial region. Second, we overcome camera pose estimation limitations in a wide variety of environments by combining dense geometric and photometric camera pose constraints. Third, we efficiently update the dense map according to place recognition and subsequent loop closure constraints by means of an ‘as-rigid-as-possible’ space deformation. We present results on a wide variety of aspects of the system and show, through evaluation on de facto standard RGB-D benchmarks, that our system performs strongly in terms of trajectory estimation, map quality, and computational performance in comparison to other state-of-the-art systems.
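To illustrate the first technique, a 3D cyclical buffer keeps a fixed-size voxel volume in GPU memory while the mapped region moves: as the volume's origin shifts with the camera, voxel addresses wrap modulo the volume resolution, so memory behind the camera is recycled for new space without copying. The sketch below is a minimal CPU-side illustration of that indexing idea only; the function name, the chosen resolution, and the coordinate convention are assumptions for exposition, not the paper's actual implementation.

```python
# Hypothetical rolling TSDF volume: a fixed cube of RES^3 voxel slots
# whose origin advances with the camera. Wrapping the offset modulo RES
# reuses the same slots as the mapped region moves (assumed convention).
RES = 512  # voxels per side (illustrative value)

def wrap_index(global_voxel, origin_voxel, res=RES):
    """Map a global voxel coordinate to a slot in the cyclical buffer.

    global_voxel: (x, y, z) integer coordinate in the world voxel grid.
    origin_voxel: current (x, y, z) origin of the rolling volume.
    """
    return tuple((g - o) % res for g, o in zip(global_voxel, origin_voxel))

# A voxel ahead of the shifted origin lands in a freshly recycled slot,
# while one behind it wraps around to the far end of the buffer.
print(wrap_index((600, 10, 10), (100, 0, 0)))  # -> (500, 10, 10)
print(wrap_index((5, 10, 10), (100, 0, 0)))    # -> (417, 10, 10)
```

The key property is that shifting the origin never touches voxel memory itself; only the mapping from world coordinates to buffer slots changes, which is what makes every-frame fusion over an unbounded region feasible on a fixed GPU budget.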

BibTeX entry:

@article{Whelan15ijrr,
   author = {T. Whelan and M. Kaess and H. Johannsson and M.F. Fallon and
	J.J. Leonard and J.B. McDonald},
   title = {Real-time Large Scale Dense {RGB-D SLAM} with Volumetric Fusion},
   journal = {Intl. J. of Robotics Research, IJRR},
   volume = {34},
   number = {4-5},
   pages = {598--626},
   month = apr,
   year = {2015}
}
Last updated: March 21, 2023