Localization and Mapping using Bearings Only Sensors

Matthew Deans and Martial Hebert


In this work we seek to produce a map of the environment while simultaneously tracking the pose of the robot within the estimated map. The scenario under consideration is one where the robot senses its own motion through odometry or inertial sensors and senses the environment through a bearings-only sensor, e.g., a camera. The map produced is a sparse, feature-based, metric map.
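The bearing measurement model behind such a sensor can be written down directly. Below is a minimal sketch, assuming a planar robot with pose (x, y, theta) and point landmarks; the function names are ours, not from the paper:

```python
import numpy as np

def wrap(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def predicted_bearing(robot_pose, landmark):
    """Bearing to a landmark as seen from the robot, in the robot frame.

    robot_pose: (x, y, theta) in world coordinates
    landmark:   (lx, ly) in world coordinates
    """
    x, y, theta = robot_pose
    lx, ly = landmark
    # Direction to the landmark in the world frame, rotated into the
    # robot frame by subtracting the heading.
    return wrap(np.arctan2(ly - y, lx - x) - theta)

# A landmark directly ahead of the robot has bearing zero.
b = predicted_bearing((0.0, 0.0, np.pi / 2), (0.0, 5.0))
```

An estimator compares such predicted bearings with measured ones; since only the angle is observed, range to a landmark must be recovered by combining observations from multiple vantage points.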

We would like the mapping algorithm to scale well to large or cluttered environments containing many landmark features. Robustness to outliers in motion and landmark bearing measurements is required. Furthermore, we would like to calibrate the system online, so that significant biases in the robot motion model or in the sensing model can be detected and removed.


Many real-world robotics applications require significant and often costly alteration of the environment so that the mobile robot can localize. Others require a costly surveying and mapping effort, typically done manually, to provide a useful map for the robot. Dynamic environments, such as strip mines, require this mapping procedure to be cheap, easy, and fast. Remote or hazardous environments, such as space robotics or hazmat cleanup tasks, require the procedure to be automated.

State of the Art:

The problem of producing maps using bearing-and-range sensors such as sonar, laser, or millimeter-wave radar has received much attention. Some of these methods require highly accurate egomotion estimates, and they generally do not extend easily to bearings-only sensors. The Structure from Motion problem in Computer Vision is similar to the bearings-only mapping problem and offers some insight into the solution, but many of these methods do not scale well. Few current mapping or SFM algorithms allow for online calibration.


The basic approach is to consider observations taken over short, contiguous blocks of time and to solve a full nonlinear optimization problem over each set of observations. The solution is a set of robot and landmark positions, or a submap. From the submap, landmark relationships are extracted which are invariant under rigid transformations. These relationships capture sufficient statistics for the data observed in each time block.
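One way to make the block optimization concrete is as a nonlinear least-squares problem over the poses and landmarks in a single time block, with odometry and bearing residuals stacked together. The sketch below is our construction, not the paper's implementation; the first pose is anchored at the origin to fix the gauge, and the synthetic data values are purely illustrative:

```python
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Wrap angles to [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def submap_residuals(params, n_poses, odom, bearings):
    """Residuals for one block: pose anchor + odometry + bearings.

    params:   [x0, y0, th0, ..., lx0, ly0, ...] flattened
    odom:     (dx, dy, dtheta) increments, each in the earlier robot frame
    bearings: (pose_index, landmark_index, measured_bearing) triples
    """
    poses = params[:3 * n_poses].reshape(n_poses, 3)
    lms = params[3 * n_poses:].reshape(-1, 2)
    res = list(poses[0])  # anchor the first pose at the origin
    for k, (dx, dy, dth) in enumerate(odom):
        x, y, th = poses[k]
        nx, ny, nth = poses[k + 1]
        c, s = np.cos(th), np.sin(th)
        # Pose difference expressed in frame k, minus the measured increment.
        res += [c * (nx - x) + s * (ny - y) - dx,
                -s * (nx - x) + c * (ny - y) - dy,
                wrap(nth - th - dth)]
    for i, j, z in bearings:
        x, y, th = poses[i]
        pred = np.arctan2(lms[j, 1] - y, lms[j, 0] - x) - th
        res.append(wrap(pred - z))
    return np.array(res)

# Synthetic noiseless block: 3 poses and 2 landmarks.
true_poses = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.5, 0.3]])
true_lms = np.array([[1.0, 2.0], [3.0, -1.0]])
odom = []
for k in range(2):
    x, y, th = true_poses[k]
    nx, ny, nth = true_poses[k + 1]
    c, s = np.cos(th), np.sin(th)
    odom.append((c * (nx - x) + s * (ny - y),
                 -s * (nx - x) + c * (ny - y), wrap(nth - th)))
bearings = [(i, j, wrap(np.arctan2(true_lms[j, 1] - true_poses[i, 1],
                                   true_lms[j, 0] - true_poses[i, 0])
                        - true_poses[i, 2]))
            for i in range(3) for j in range(2)]

truth = np.concatenate([true_poses.ravel(), true_lms.ravel()])
rng = np.random.default_rng(0)
x0 = truth + 0.05 * rng.standard_normal(truth.size)
sol = least_squares(submap_residuals, x0, args=(3, odom, bearings)).x
```

Solving each block this way yields the submap from which the invariant landmark relationships are then extracted.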

Estimates of landmark positions in a global reference frame are influenced by errors in the robot pose: a landmark estimate is essentially computed by measuring the landmark in rover coordinates and then transforming it into the world coordinate frame via the uncertain robot pose. Estimates of invariant features are not influenced by vehicle pose or pose uncertainty, since the relationships are measured independently of where the robot is located.
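Inter-landmark distances are a simple example of such an invariant (the paper does not commit to a particular set of relationships here; the choice of distances is illustrative). Applying an arbitrary rigid transformation, standing in for pose error, leaves them unchanged:

```python
import numpy as np

def pairwise_distances(landmarks):
    """Matrix of inter-landmark distances, invariant to rigid motion."""
    diff = landmarks[:, None, :] - landmarks[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def rigid_transform(points, theta, t):
    """Rotate by theta and translate by t: a stand-in for pose error."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta), np.cos(theta)]])
    return points @ R.T + t

lms = np.array([[0.0, 0.0], [2.0, 1.0], [1.0, 3.0]])
moved = rigid_transform(lms, 0.7, np.array([5.0, -2.0]))
# pairwise_distances(lms) equals pairwise_distances(moved) up to round-off.
```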

The invariant relationships are recursively estimated as more batches of observations are processed. These relationships are used in a separate optimization which finds a global map that is most consistent with all of the invariant relationships from all of the submaps.
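A simplified stand-in for that global optimization is to find landmark positions whose pairwise distances best match the accumulated invariant estimates. The sketch below omits the recursion over batches; because the global map is only determined up to a rigid transformation, the fit is judged by the distances it reproduces rather than by absolute positions:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_map_to_invariants(pairs, dists, init):
    """Least-squares fit of 2-D landmark positions to distance invariants.

    pairs: list of (i, j) landmark index pairs
    dists: estimated distance for each pair
    init:  (n, 2) array of initial landmark positions
    """
    def residuals(flat):
        pts = flat.reshape(-1, 2)
        return np.array([np.linalg.norm(pts[i] - pts[j]) - d
                         for (i, j), d in zip(pairs, dists)])
    return least_squares(residuals, init.ravel()).x.reshape(-1, 2)

# Three landmarks forming a 3-4-5 triangle, started from a perturbed guess.
pairs = [(0, 1), (0, 2), (1, 2)]
dists = [3.0, 4.0, 5.0]
init = np.array([[0.1, -0.1], [2.9, 0.2], [0.2, 3.8]])
fitted = fit_map_to_invariants(pairs, dists, init)
```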

Future Work:

We are currently extending the method in two ways. First, regression diagnostics are being investigated for detecting and removing outliers and bias in vehicle motion and landmark bearing measurements. Second, we are looking at ways to solve the data association problem, matching observed landmarks in a submap to landmarks in the current global map without requiring appearance-based matching or accurate alignment between the submap and the global map. This will allow a robot whose absolute position estimate has drifted significantly to detect that it has entered a region it has visited before.

Figure 1: Robot trajectory estimated with dead reckoning only. The robot actually drove in a closed elliptical path, but bias in the motion model produces a spiral-shaped estimate.

Figure 2: Robot trajectory and landmark positions estimated with odometry and landmark bearings combined. The closed loop shape of the true trajectory is recovered.


