Vision through Fog and Haze

Conventional vision systems are designed to perform in clear weather. In any outdoor application, however, there is no escape from "bad" weather. Images taken in poor weather conditions suffer from severe color and contrast degradation. Furthermore, this degradation worsens exponentially with distance, making it impossible to acquire meaningful images of scenes far from the imaging system. Computer vision systems must therefore include mechanisms that enable them to function (even if somewhat less reliably) in the presence of haze, fog, rain, hail and snow. In this project, we study the visual manifestations of different weather conditions. We draw on what is already known about atmospheric optics and identify effects caused by bad weather that can be turned to our advantage. Since the atmosphere modulates the information carried from a scene point to the observer, it can be viewed as a mechanism of visual information coding. We exploit two fundamental scattering models, attenuation and airlight, to describe the colors, contrasts and polarizations of scene points observed through bad weather. We then use these models to develop methods for recovering pertinent scene properties, such as three-dimensional structure, from one or two images taken under poor weather conditions.
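The two scattering models combine in the standard haze image-formation equation: observed brightness is the scene radiance attenuated by e^(-beta*d) plus airlight that grows toward the horizon brightness with depth. A minimal per-pixel sketch, where the function names and the scalar form are illustrative (beta is the scattering coefficient, d the depth):

```python
import math

def attenuation(radiance, beta, depth):
    """Direct transmission: scene radiance decays exponentially with depth."""
    return radiance * math.exp(-beta * depth)

def airlight(horizon_radiance, beta, depth):
    """Airlight: scattered environmental light grows with depth,
    approaching the horizon (sky) brightness."""
    return horizon_radiance * (1.0 - math.exp(-beta * depth))

def observed(radiance, horizon_radiance, beta, depth):
    """Total brightness of a scene point seen through a scattering medium."""
    return attenuation(radiance, beta, depth) + airlight(horizon_radiance, beta, depth)
```

At depth zero the observed brightness equals the scene radiance; as depth grows, it converges to the horizon brightness, which is exactly the contrast loss described above.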


"Interactive (De)weathering of an Image using Physical Models,"
S.G. Narasimhan and S.K. Nayar,
ICCV Workshop on Color and Photometric Methods in Computer Vision (CPMCV),
Oct, 2003.

"Contrast Restoration of Weather Degraded Images,"
S.G. Narasimhan and S.K. Nayar,
IEEE Transactions on Pattern Analysis and Machine Intelligence,
Vol.25, No.6, pp.713-724, Jun, 2003.

"Polarization-Based Vision through Haze,"
Y.Y. Schechner, S.G. Narasimhan and S.K. Nayar,
Applied Optics (Special Issue),
Vol.42, No.3, pp.511-525, Jan, 2003.

"Vision and the Atmosphere,"
S.G. Narasimhan and S.K. Nayar,
International Journal of Computer Vision,
Vol.48, No.3, pp.233-254, Jul, 2002.

"Removing Weather Effects from Monochrome Images,"
S.G. Narasimhan and S.K. Nayar,
IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
Vol.II, pp.186-193, Dec, 2001.

"Instant Dehazing of Images using Polarization,"
Y.Y. Schechner, S.G. Narasimhan and S.K. Nayar,
IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
Vol.I, pp.325-332, Dec, 2001.

"Chromatic Framework for Vision in Bad Weather,"
S.G. Narasimhan and S.K. Nayar,
IEEE Conference on Computer Vision and Pattern Recognition (CVPR),
Vol.1, pp.598-605, Jun, 2000.
[Best Paper Honorable Mention]

"Vision in Bad Weather,"
S.K. Nayar and S.G. Narasimhan,
IEEE International Conference on Computer Vision (ICCV),
Vol.2, pp.820-827, 1999.


Depth of Light Sources from Attenuation:
This picture shows results of an experiment where two night images of light sources under different weather conditions are used to compute the relative depths of the sources.
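A sketch of the underlying idea, under the attenuation model for a point source: because the source's radiant intensity and inverse-square falloff cancel in the brightness ratio between the two nights, the log ratio ln(E1/E2) = (beta2 - beta1)*d is proportional to depth, so ratios of log ratios give relative depths. The function name and scalar form are illustrative, not the paper's implementation:

```python
import math

def relative_depths(bright, dim):
    """Relative depths of light sources from their brightnesses under
    mild scattering (bright) and dense scattering (dim).
    The log brightness ratio per source is (beta2 - beta1) * depth,
    so normalizing by the first source yields depths relative to it."""
    log_ratios = [math.log(b / a) for b, a in zip(bright, dim)]
    ref = log_ratios[0]
    return [r / ref for r in log_ratios]
```

Note the atmospheric coefficients beta1 and beta2 never need to be known; only their difference enters, and it cancels in the normalization.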
Structure from Airlight:
This picture shows results of an experiment where a single image of a scene captured on a very foggy day is used to compute three-dimensional scene structure.
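The airlight model can be inverted directly: since airlight brightness is E = E_sky * (1 - e^(-beta*d)), each pixel yields depth up to the unknown scattering coefficient beta. A minimal sketch (scalar, illustrative names):

```python
import math

def scaled_depth(airlight_brightness, sky_brightness):
    """Invert the airlight model: beta * d = -ln(1 - E / E_sky).
    Returns scene depth scaled by the unknown scattering coefficient."""
    return -math.log(1.0 - airlight_brightness / sky_brightness)
```

Applied per pixel on a dense-fog image (where airlight dominates), this produces the scaled three-dimensional structure shown in the result.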
Contrast Restoration:
While airlight increases the apparent brightness of the scene with distance, attenuation decreases the actual scene radiance with distance. We take into account both airlight and attenuation to model the contrast loss in images. This picture shows the results of applying our contrast restoration algorithm to two images taken in different but unknown misty conditions.
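When depth and the scattering coefficient are known (or estimated, as in the paper, from the images themselves), contrast can be restored by subtracting the airlight term and dividing out the attenuation. A hedged per-pixel sketch of this inversion, with illustrative names and with beta and depth supplied as assumed inputs:

```python
import math

def restore(observed, sky, beta, depth):
    """Invert the combined model: subtract airlight, then undo attenuation.
    observed: measured pixel brightness; sky: horizon brightness."""
    t = math.exp(-beta * depth)          # transmission of the direct beam
    return (observed - sky * (1.0 - t)) / t
```

Running the forward model and then this inversion recovers the original scene radiance exactly, which is the round-trip property the restoration relies on.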
Weather and Polarization:
The natural light scattered by atmospheric particles (airlight) is partially polarized. We exploit this fact to remove haze from images. Note that optical filtering alone cannot remove haze effects, except in very restricted situations. This picture shows how our method uses only two images taken through a polarizer at different orientations to completely remove the effects of haze. The method works instantly, without relying on changes in weather conditions, and yields a range map of the scene as a by-product. The two original images used in our experiments are available for download.
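The core computation can be sketched as follows: because the direct transmission is essentially unpolarized while airlight is partially polarized, the difference between the two polarizer images isolates the airlight, which is then removed and the attenuation undone. In this illustrative scalar sketch, the degree of polarization p and the airlight at infinity a_inf are passed in as parameters (in practice they are estimated, e.g. from sky regions); the names are hypothetical:

```python
def dehaze(i_best, i_worst, p, a_inf):
    """Remove haze from two polarizer-orientation measurements.
    i_best: brightness at the orientation minimizing airlight;
    i_worst: at the orientation maximizing it;
    p: degree of polarization of the airlight;
    a_inf: airlight brightness at infinite depth (sky)."""
    total = i_best + i_worst                  # full (unfiltered) brightness
    airlight = (i_worst - i_best) / p         # airlight from the polarized part
    transmission = 1.0 - airlight / a_inf     # e^(-beta*d), from the airlight model
    return (total - airlight) / transmission  # dehazed scene radiance
```

The transmission estimate, -ln of which is proportional to depth, is what yields the range map mentioned above.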
Interactive Deweathering:
In this work, we address the question of deweathering a single image using simple additional information provided interactively by the user, such as marking the sky, the depth trends in the scene, etc. Our interactive methods for image (de)weathering can serve as easy-to-use plug-ins for a variety of image processing software.


(Video Result Playlist)
Video Summary:
This video shows several applications of the deweathering models and algorithms we have developed (requires Apple QuickTime 6.0).