Information-Theoretic Online Multi-Camera Extrinsic Calibration

Download: PDF.

“Information-Theoretic Online Multi-Camera Extrinsic Calibration” by E. Dexheimer, P. Peluse, J. Chen, J. Pritts, and M. Kaess. IEEE Robotics and Automation Letters, RA-L, vol. 7, no. 2, Apr. 2022, pp. 4757-4764. Presented at ICRA 2022.


Calibration of multi-camera systems is essential for lifelong use of vision-based headsets and autonomous robots. In this work, we present an information-based framework for online extrinsic calibration of multi-camera systems. While previous work largely focuses on monocular, stereo, or strictly non-overlapping field-of-view (FoV) setups, we allow arbitrary configurations while also exploiting overlapping pairwise FoV when possible. In order to efficiently solve for the extrinsic calibration parameters, which increase linearly with the number of cameras, we propose a novel entropy-based keyframe measure and bound the backend optimization complexity by selecting informative motion segments that minimize the maximum entropy across all extrinsic parameter partitions. We validate the pipeline on three distinct platforms to demonstrate the generality of the method for resolving the extrinsics and performing downstream tasks. Our code is available at
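The keyframe selection described above rests on a standard information-theoretic quantity: the differential entropy of a Gaussian marginal over each camera's extrinsic parameters. As a rough illustration of the min-max idea (not the paper's implementation; function names and the acceptance criterion here are hypothetical), one can score a candidate motion segment by the largest entropy among the per-camera extrinsic covariance blocks and prefer segments that reduce that worst case:

```python
import numpy as np

def gaussian_entropy(cov):
    # Differential entropy of an n-dim Gaussian:
    # H = 0.5 * log((2*pi*e)^n * det(cov)), computed via slogdet for stability.
    n = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (n * np.log(2.0 * np.pi * np.e) + logdet)

def max_partition_entropy(marginal_covs):
    # Worst-case entropy across the per-camera extrinsic partitions
    # (each entry is the 6x6 marginal covariance of one camera's extrinsics).
    return max(gaussian_entropy(c) for c in marginal_covs)

def accept_segment(candidate_covs, best_score):
    # Hypothetical min-max acceptance rule: keep the candidate motion
    # segment only if it lowers the maximum partition entropy.
    score = max_partition_entropy(candidate_covs)
    return score < best_score, min(score, best_score)
```

Scoring by the maximum (rather than the sum) of partition entropies keeps the least-constrained camera from being starved of informative measurements as the number of extrinsic parameters grows with the rig size.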


BibTeX entry:

   @article{Dexheimer22ral,
      author = {E. Dexheimer and P. Peluse and J. Chen and J. Pritts and M. Kaess},
      title = {Information-Theoretic Online Multi-Camera Extrinsic Calibration},
      journal = {IEEE Robotics and Automation Letters, RA-L},
      volume = {7},
      number = {2},
      pages = {4757-4764},
      month = apr,
      year = {2022},
      note = {Presented at ICRA 2022.}
   }
Last updated: March 21, 2023