The input to this system consists of a sequence of images from a color camera, digitized at 15 Hz. RACCOON examines a region of interest in each image surrounding the expected location of the lead vehicle. Pixels in this area are thresholded for absolute brightness and redness (to eliminate spurious reflections and headlights). Since taillights vary tremendously from car to car, and also over time (as brake lights and turn signals are illuminated), a model-based approach is rejected in favor of a simple bounding box.
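The thresholding and bounding-box step described above might be sketched as follows. The specific threshold values and the red-dominance test are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def taillight_bounding_box(roi, min_brightness=200, min_red_ratio=1.5):
    """Threshold an RGB region of interest for bright, red pixels and
    return the bounding box of the pixels that pass. The threshold
    values here are illustrative, not from the original system."""
    r = roi[:, :, 0].astype(float)
    g = roi[:, :, 1].astype(float)
    b = roi[:, :, 2].astype(float)
    # Keep pixels that are both bright and distinctly red, rejecting
    # white headlights and dim reflections.
    mask = (r > min_brightness) & (r > min_red_ratio * np.maximum(g, b))
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no taillight candidates in this region of interest
    return xs.min(), ys.min(), xs.max(), ys.max()  # (left, top, right, bottom)
```

Because only a small region of interest is scanned and the test is a per-pixel threshold, this step is cheap enough to run at frame rate.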
The position and size of this box can be used to extract the relative position of the lead vehicle with respect to the camera. Since the road cannot be assumed to be flat, the vertical position of the lights is not a good indicator of the distance to the car. However, the horizontal position of the lights reflects the vehicle's bearing. Since brake lights can change the vertical size of the bounding box, that measurement is error prone. By contrast, the horizontal extent of the box is inversely proportional to the range of the vehicle. The effects of foreshortening due to lead vehicle yaw are small enough to be ignored for typical driving situations. The equations for converting these simple image measurements into relative position are straightforward and computationally efficient, allowing us to process images very quickly. Although nighttime scenes are ideal, this algorithm also works during the day if the lead vehicle illuminates its taillights. If desired, bright decals or infrared light sources can be substituted for taillights without modification to the algorithm.
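The range and bearing computation can be sketched as below. The camera focal length, image center, and assumed taillight separation are hypothetical parameters for illustration; the paper does not give specific values:

```python
import math

# Hypothetical camera and vehicle parameters (not from the paper):
FOCAL_LENGTH_PX = 800.0   # camera focal length, in pixels
IMAGE_CENTER_X = 320.0    # horizontal optical center, in pixels
TAILLIGHT_SPAN_M = 1.4    # assumed physical distance between taillights

def relative_position(box_left, box_right):
    """Estimate range and bearing of the lead vehicle from the
    horizontal extent and position of its taillight bounding box."""
    width_px = box_right - box_left
    # Range is inversely proportional to the horizontal extent of the box.
    range_m = FOCAL_LENGTH_PX * TAILLIGHT_SPAN_M / width_px
    # Bearing follows from the horizontal position of the box center.
    center_px = (box_left + box_right) / 2.0
    bearing_rad = math.atan2(center_px - IMAGE_CENTER_X, FOCAL_LENGTH_PX)
    return range_m, bearing_rad
```

Only the horizontal edges of the box are used, consistent with the observation that the vertical measurements are unreliable on non-flat roads and under changing brake-light illumination.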
Given the position of the lead vehicle, the straightforward approach to car following is to steer the autonomous vehicle so that it heads towards the taillights of the lead vehicle. Speed can be controlled so that the robot vehicle remains a constant distance behind the lead car. This naive implementation may produce satisfactory results on straight roads when both vehicles are moving at the same speed; however, it fails in any realistic scenario, since lead vehicles change speed and make turns to follow winding roads, and steering towards taillights results in corner cutting, possibly causing an accident as the computer-controlled vehicle drifts into oncoming traffic or off the road entirely.
RACCOON solves these problems by creating an intermediate map structure which records the lead vehicle's trajectory. The path is represented by points in a global reference frame, and the computer-controlled vehicle is steered from point to point. The autonomous vehicle follows this trail while keeping the lead vehicle's taillights in sight. Since every point on the trail is guaranteed to be on the road, the robot vehicle navigates around corners and obstacles rather than through them. A second important advantage is that the autonomous vehicle is not constrained to follow at a constant distance, but may instead follow at its own pace. By changing the problem from ``car following'' to ``path tracking'', the system is able to drive competently in real situations.
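The trail idea can be illustrated with a minimal sketch: the lead vehicle's observed positions are appended to a list of global-frame points, and the robot steers toward the oldest point it has not yet reached. The class and the reach-distance parameter are assumptions for illustration, not the paper's actual data structure:

```python
import math

class TrailTracker:
    """Minimal sketch of path tracking: record the lead vehicle's
    positions in a global frame, then steer toward the oldest
    unvisited trail point instead of directly toward the taillights."""

    def __init__(self, reach_dist=2.0):
        self.trail = []              # (x, y) points in a global frame
        self.reach_dist = reach_dist  # distance at which a point counts as reached

    def record(self, lead_x, lead_y):
        """Append the lead vehicle's current global position to the trail."""
        self.trail.append((lead_x, lead_y))

    def steering_target(self, robot_x, robot_y):
        """Return the next trail point to steer toward, discarding
        points the robot has already reached."""
        while self.trail:
            tx, ty = self.trail[0]
            if math.hypot(tx - robot_x, ty - robot_y) < self.reach_dist:
                self.trail.pop(0)    # already reached; move on
            else:
                return (tx, ty)      # next point on the lead vehicle's path
        return None                  # trail exhausted
```

Because the robot steers through recorded points rather than at the taillights directly, it traces the lead vehicle's actual path around a corner instead of cutting across it, and it can close or open the following distance at its own pace.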
RACCOON is implemented as a module which allows easy integration with existing autonomous driving systems. In particular, it can complement road followers in situations where they get confused. Other applications for RACCOON include convoy following and intelligent cruise control.