Z.Yin and R.Collins, "On-the-fly Object Modeling while Tracking," IEEE Conference on Computer Vision and Pattern Recognition (CVPR'07), Minneapolis, MN, June 2007


To implement a persistent tracker, we build a set of view-dependent object appearance models adaptively and automatically while tracking an object under different viewing angles. This collection of acquired models is indexed with respect to the view sphere. The acquired models aid recovery from tracking failure due to occlusion and changing view angle. In this paper, view-dependent object appearance is represented by intensity patches around detected Harris corners. The intensity patches from a model are matched to the current frame by solving a bipartite linear assignment problem with outlier exclusion and missed inlier recovery. Based on these reliable matches, the change in object rotation, translation and scale is estimated between consecutive frames using Procrustes analysis. The experimental results show good performance using a collection of view-specific patch-based models for detection and tracking of vehicles in low-resolution airborne video. [PDF]
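The rotation, translation, and scale update from matched patch centers can be sketched with a standard least-squares similarity (Procrustes) fit. The sketch below is an illustrative implementation, not the paper's code; the function name `similarity_procrustes` and the complex-number formulation are assumptions for clarity.

```python
import cmath

def similarity_procrustes(src, dst):
    """Least-squares similarity (Procrustes) fit of 2-D point sets.

    Estimates scale s, rotation theta, and translation t such that
    each source point maps to its matched destination point under
    z -> s * exp(i*theta) * z + t, with points treated as complex
    numbers. src and dst are equal-length lists of (x, y) tuples of
    matched points (e.g. patch centers matched between frames).
    """
    p = [complex(x, y) for x, y in src]
    q = [complex(x, y) for x, y in dst]
    mp = sum(p) / len(p)              # source centroid
    mq = sum(q) / len(q)              # destination centroid
    pc = [z - mp for z in p]          # centered source points
    qc = [z - mq for z in q]          # centered destination points
    # Closed-form least-squares similarity between centered sets:
    # a = s * exp(i*theta) minimizes sum |a*pc_i - qc_i|^2
    a = (sum(u.conjugate() * v for u, v in zip(pc, qc))
         / sum(abs(u) ** 2 for u in pc))
    s = abs(a)                        # scale change
    theta = cmath.phase(a)            # rotation in radians
    t = mq - a * mp                   # translation
    return s, theta, (t.real, t.imag)
```

In the paper's setting the matched pairs would come from the bipartite assignment step after outlier exclusion; here any consistent set of correspondences recovers the transform exactly, and noisy matches yield the least-squares estimate.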

Learning the patch models while tracking (demo videos):