Compositional and Scalable Object SLAM

Download: PDF.

“Compositional and Scalable Object SLAM” by A. Sharma, W. Dong, and M. Kaess. In Proc. IEEE Intl. Conf. on Robotics and Automation (ICRA), Xi'an, China, May 2021, pp. 11626-11632.

Abstract

We present a fast, scalable, and accurate Simultaneous Localization and Mapping (SLAM) system that represents indoor scenes as a graph of objects. Leveraging the observation that artificial environments are structured and occupied by recognizable objects, we show that a compositional and scalable object mapping formulation is amenable to a robust SLAM solution for drift-free large-scale indoor reconstruction. To achieve this, we propose a novel semantically assisted data association strategy that results in unambiguous persistent object landmarks and a 2.5D compositional rendering method that enables reliable frame-to-model RGB-D tracking. Consequently, we deliver an optimized online implementation that can run at near frame rate with a single graphics card, and provide a comprehensive evaluation against state-of-the-art baselines. An open-source implementation will be provided at https://github.com/rpl-cmu/object-slam.
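The abstract only names the semantically assisted data association strategy without detailing it. As a rough illustration of the general idea, the sketch below shows one simple way semantic class labels can gate a nearest-centroid match so that repeated observations of the same physical object resolve to a single persistent landmark while new objects spawn new landmarks. The ObjectLandmark and Detection containers, the 0.5 m distance gate, and the greedy nearest-centroid rule are assumptions made for this sketch; they are not taken from the paper.

```python
import numpy as np
from dataclasses import dataclass


@dataclass
class ObjectLandmark:
    """Hypothetical persistent object landmark (not the paper's data structure)."""
    object_id: int
    label: str            # semantic class, e.g. "chair"
    centroid: np.ndarray  # 3-D centroid in the world frame


@dataclass
class Detection:
    """Hypothetical per-frame object detection, already transformed to the world frame."""
    label: str
    centroid: np.ndarray


def associate(detections, landmarks, max_dist=0.5):
    """Greedy semantically gated association: a detection may only match a
    landmark of the same semantic class, and only if its centroid lies within
    max_dist metres of that landmark; otherwise a new landmark is spawned."""
    matches = []
    for det in detections:
        # Semantics first: restrict candidates to landmarks of the same class.
        candidates = [lm for lm in landmarks if lm.label == det.label]
        best = None
        if candidates:
            best = min(candidates,
                       key=lambda lm: np.linalg.norm(lm.centroid - det.centroid))
            if np.linalg.norm(best.centroid - det.centroid) >= max_dist:
                best = None  # geometrically inconsistent, reject the match
        if best is None:
            # No semantically and geometrically consistent landmark: create a new one.
            new_id = max((lm.object_id for lm in landmarks), default=-1) + 1
            best = ObjectLandmark(new_id, det.label, det.centroid.copy())
            landmarks.append(best)
        matches.append((det, best))
    return matches


if __name__ == "__main__":
    landmarks = [ObjectLandmark(0, "chair", np.array([1.0, 0.0, 0.0]))]
    detections = [Detection("chair", np.array([1.1, 0.05, 0.0])),   # re-observed chair
                  Detection("table", np.array([3.0, 2.0, 0.0]))]    # previously unseen object
    for det, lm in associate(detections, landmarks):
        print(det.label, "->", lm.object_id)
```

In this toy run the re-observed chair resolves to the existing landmark 0 and the table creates landmark 1; the paper's actual strategy operates on full object models and RGB-D measurements rather than bare centroids.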


BibTeX entry:

@inproceedings{Sharma21icra,
   author = {A. Sharma and W. Dong and M. Kaess},
   title = {Compositional and Scalable Object {SLAM}},
   booktitle = {Proc. IEEE Intl. Conf. on Robotics and Automation, ICRA},
   pages = {11626--11632},
   address = {Xi'an, China},
   month = may,
   year = {2021}
}