"Fusion of Omni-directional Sonar and Omni-directional Vision for Environment Recognition of Mobile Robots"
by Teruko YATA, Akihisa OHYA, Shin'ichi YUTA

2000 IEEE International Conference on Robotics and Automation


Abstract

This paper proposes a new method of sensor fusion for an omni-directional ultrasonic sensor and an omni-directional vision sensor. The new omni-directional sonar which we developed can measure accurate distances and bearing angles of reflecting points, and the omni-directional vision sensor can give bearing angles to segment edges. We propose a sensor fusion method that combines the reflecting points measured by the sonar and the segment edges measured by the vision sensor, based on their bearing angles. These data are different in character, so they complement each other in the proposed method, making it possible to obtain richer information useful for environment recognition. We describe the proposed method and present an experimental result to show its potential.
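To illustrate the kind of bearing-based association the abstract describes, below is a minimal sketch (not the authors' actual algorithm) that pairs sonar reflecting points (distance plus bearing) with vision segment edges (bearing only) by nearest bearing angle within a tolerance. The data structures, the function names, and the 5-degree tolerance are all assumptions made for this example.

    import math
    from dataclasses import dataclass

    @dataclass
    class SonarPoint:
        range_m: float      # distance to a reflecting point (metres) -- assumed representation
        bearing_rad: float  # bearing angle of the reflecting point (radians)

    @dataclass
    class VisionEdge:
        bearing_rad: float  # bearing angle to a segment edge (radians)

    def angle_diff(a: float, b: float) -> float:
        """Smallest absolute difference between two angles, wrapped to [0, pi]."""
        return abs((a - b + math.pi) % (2.0 * math.pi) - math.pi)

    def fuse_by_bearing(sonar_points, vision_edges, tol_rad=math.radians(5.0)):
        """Pair each vision edge with the sonar reflecting point whose bearing
        is closest, keeping the pair only if the bearings agree within tol_rad.
        Returns a list of (edge, point) pairs."""
        pairs = []
        for edge in vision_edges:
            best = min(sonar_points,
                       key=lambda p: angle_diff(p.bearing_rad, edge.bearing_rad),
                       default=None)
            if best is not None and angle_diff(best.bearing_rad, edge.bearing_rad) <= tol_rad:
                pairs.append((edge, best))
        return pairs

    # Example: only the first edge lies within tolerance of a sonar bearing.
    sonar = [SonarPoint(1.2, math.radians(30)), SonarPoint(2.5, math.radians(120))]
    edges = [VisionEdge(math.radians(32)), VisionEdge(math.radians(200))]
    print(fuse_by_bearing(sonar, edges))

In such a pairing, the vision edge contributes a precise bearing while the sonar point contributes range, which is one plausible way the two complementary measurements could be combined for environment recognition.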