Texture-Illumination Separation for Single-shot Structured Light Reconstruction

Flags Pattern

Active-illumination methods trade off acquisition time against the resolution of the estimated 3D shapes. Multi-shot approaches can generate dense reconstructions but require stationary scenes. Single-shot methods apply to dynamic objects but estimate only sparse reconstructions and are sensitive to surface texture. We present a single-shot approach that produces dense shape reconstructions of highly textured objects illuminated by one or multiple projectors. The key to our approach is an image decomposition scheme that recovers the illumination image of each projector and the texture image from their mixed appearance, whether caused by illumination from one projector onto a textured surface, illumination from multiple projectors onto a textureless surface, or their combined effect. Our method accurately computes per-pixel warps from the illumination patterns and the texture template to the observed image. The texture template is obtained by interleaving the projection sequence with an all-white pattern. The estimated warps remain reliable even with infrequent interleaved projection and significant object deformation. Thus, we obtain detailed shape reconstruction and dense motion tracking of textured surfaces. The proposed method, implemented on a system with one camera and two projectors, is validated on synthetic and real data containing subtle non-rigid surface deformations.

Publications


"Separating Texture and Illumination for Single-Shot Structured Light Reconstruction"
M. Vo, S. G. Narasimhan, and Y. Sheikh,
The IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops,
June 2014.
[PDF][PPT]

"Texture Illumination Separation for Single-shot Structured Light Reconstruction"
M. Vo, S. G. Narasimhan, and Y. Sheikh,
Submitted to IEEE PAMI

Technical summary

Because the surface is highly textured, spatial decoding of the high-frequency light pattern cannot be applied directly to the observed image.

The projector pattern serves as the illumination template. The texture template is obtained by interleaving the projection sequence with an all-white pattern.
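The interleaving step above can be sketched as follows; this is a minimal illustration, not the paper's implementation, and the function name and fixed interleaving period are assumptions:

```python
def split_interleaved(frames, period):
    """Split a captured sequence into texture-template frames (those lit
    by the all-white pattern, assumed here to be every `period`-th frame)
    and the remaining pattern-illuminated frames."""
    texture = [f for i, f in enumerate(frames) if i % period == 0]
    pattern = [f for i, f in enumerate(frames) if i % period != 0]
    return texture, pattern
```

With infrequent interleaving (a large `period`), most frames carry the structured-light pattern, and the texture template for those frames must be warped from the nearest all-white frame.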

Affine warping functions are applied to both the texture and illumination templates to synthesize the observed image. Once the mapping from the light pattern in the observed image to its template is recovered, the 3D shape is computed by triangulation.
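The two steps above can be sketched as follows. This is a simplified illustration, not the paper's implementation: the 6-parameter affine layout, nearest-neighbor sampling, and the linear (DLT) triangulation are assumptions for the sketch.

```python
import numpy as np

def affine_warp_point(p, a):
    """Map point p=(x, y) through a 6-parameter affine warp
    a = [a1, a2, a3, a4, tx, ty] (hypothetical parameterization)."""
    x, y = p
    return np.array([a[0] * x + a[1] * y + a[4],
                     a[2] * x + a[3] * y + a[5]])

def synthesize_patch(template, a, patch_coords):
    """Synthesize an observed patch by warping its coordinates into the
    template and sampling (nearest neighbor for brevity)."""
    h, w = template.shape
    out = np.empty(len(patch_coords))
    for i, p in enumerate(patch_coords):
        u, v = np.clip(np.round(affine_warp_point(p, a)),
                       0, [w - 1, h - 1]).astype(int)
        out[i] = template[v, u]
    return out

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of a point seen at pixel x1 in the
    camera (projection matrix P1) and x2 in the projector (P2)."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]          # null vector of A, homogeneous 3D point
    return X[:3] / X[3]
```

Once the affine parameters are estimated per pixel, each camera-projector correspondence they induce feeds `triangulate` to recover the 3D point.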

Results: Mixture of texture and one illumination source

(Video Playlist)

The texture-illumination greedy growing process: the propagation uses the estimated affine warping coefficients of previously processed points as the initial guess.
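The propagation scheme can be sketched as a breadth-first region growing over the pixel grid; the function names and the `refine` callback are placeholders, and the real method's quality-ordered growing and per-pixel optimization are abstracted away:

```python
from collections import deque

def grow_warps(shape, seed, seed_params, refine):
    """Greedy growing: starting from a seed pixel with known warp
    parameters, visit 4-connected neighbors breadth-first; each new
    pixel refines its parent's parameters (used as the initial guess)
    via the caller-supplied `refine(pixel, init_params)`."""
    h, w = shape
    params = {seed: refine(seed, seed_params)}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in params:
                params[(ny, nx)] = refine((ny, nx), params[(y, x)])
                queue.append((ny, nx))
    return params
```

In practice `refine` would run a local photometric optimization of the affine coefficients; with a trivial identity `refine` the sketch simply floods the seed's parameters across the grid.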

Flower dress sequence with a one-projector, one-camera setup

Flag shirt sequence with a one-projector, one-camera setup

Dog cloth sequence with a one-projector, one-camera setup

Results: Mixture of two illumination sources

(Video Playlist)

Dressed shirt sequence with a two-projector, one-camera setup

Results: Mixture of texture and two illumination sources

(Video Playlist)

Cooking glove sequence with a two-projector, one-camera setup

Flag shirt sequence with a two-projector, one-camera setup

Dog shirt sequence with a two-projector, one-camera setup

Texture separation for image tracking with a two-projector, one-camera setup

Acknowledgements


This research was supported in part by ONR Grant N00014-11-1-0295, NSF Grant IIS-1317749, and NSF Grant No. 1353120.