To be deployed in real-world driving environments, a self-driving car must be capable of reliably detecting and effectively tracking moving objects. In this talk, I will briefly present our new moving-object detection and tracking system, which extends and improves the earlier system we used in the 2007 DARPA Urban Challenge.
First, I will explain our new vision-based recognition system, which detects pedestrians, bicyclists, and vehicles in real time (at around 8 Hz on 640x480 images) using the deformable part-based model (DPM). Second, I will quickly show a new multi-sensor fusion system for moving-object tracking and explain how visual information from the vision module improves the tracker's performance. Finally, I will present tracking results on real sensor data.
Hyunggi Cho studied electrical engineering and robotics at Yonsei University and CMU, respectively. He is now pursuing his PhD in ECE at CMU, working with Prof. Vijayakumar and Prof. Rajkumar. His research interests are in computer vision and multi-sensor fusion, with a focus on 3D scene understanding for autonomous driving in urban environments.