

A time-of-flight depth camera that works outdoors in bright sunlight.

Imaging Capabilities

Time-of-flight (ToF) depth sensors have become the technology of choice in diverse applications today, from automotive and aviation to robotics, gaming and consumer electronics. These sensors come in two general flavors: LIDAR-based systems that rely on extremely brief pulses of light to sense depth, and continuous-wave systems that emit a modulated light signal over a much longer duration. LIDAR systems can acquire centimeter-accurate depth maps up to a kilometer away in broad daylight, but they have low measurement rates and a cost per pixel orders of magnitude higher than that of continuous-wave ToF (CW-ToF) devices, whose range, outdoor operation and robustness are, in turn, extremely limited. Since low cost, large-scale production and high measurement rate often trump other considerations, CW-ToF sensors continue to dominate the consumer electronics and low-end robotics space despite their shortcomings.

The EpiToF extends the Episcan3D approach to continuous-wave time-of-flight technology. Its robustness to difficult conditions stems from its unique imaging capabilities: the sensor can capture images in which most light from ambient sources (like the Sun) is blocked out, and it captures only direct light paths, which enables more accurate imaging of scenes with scattered light. The same imaging method also enables interference-free imaging with multiple ToF devices and imaging without motion blur.

Our current EpiToF prototype captures 320x240 resolution depth images at 7.5 frames per second outdoors in sunlight at ranges up to 15 meters. Increased range and video frame rates are possible with a well-engineered system.

Blocking Ambient Light

The depth cameras available on the market today only work indoors; outside, their active light sources are overwhelmed by sunlight. EpiToF uses a unique imaging technique that blocks almost all ambient light, allowing its low-power light source to compete with the Sun. This opens up exciting new applications for depth cameras, such as outdoor imaging and self-driving cars.

Direct Light Path Imaging

EpiToF can differentiate between single-bounce light that reflects directly off surfaces and more complex multiple-bounce light paths that involve interreflection and scattering. The presence of complex light paths confuses most conventional 3D scanners, but EpiToF is robust to these types of light paths and produces accurate measurements.

With regular ToF cameras, diffuse interreflections between walls and ceiling cause depths to be overestimated and the corner to be rounded out. With epipolar imaging, the walls appear straight and meet at a sharp right angle. The conference table appears specular at grazing angles, but the EpiToF captures only the first light bounce. In the bathroom scene, the ghosting on the wall due to reflections from the mirror is suppressed by epipolar imaging. The water fountain is particularly challenging because the direct return from its metallic surface is very weak, but the surface reflects a lot of indirect light back to the sensor.

Multi-Camera Imaging

As continuous-wave ToF cameras appear in more and more devices, they must be able to operate without interfering with each other. Devices of a given make and model can be configured not to interfere by varying their modulation frequencies or by synchronizing them, but robustness against the broader ecosystem of CW-ToF sensors is desirable. EpiToF enables interference-free live 3D imaging even for devices with exactly the same modulation frequency and light source wavelength.

Multiple Camera Interference

Imaging with Camera Motion

Regular ToF imaging suffers from motion blur and strong artifacts at depth discontinuities when the camera moves during frame capture. With epipolar ToF imaging, motion blur has almost no effect and the acquired depth map exhibits only a rolling-shutter-like skew. This skew can be corrected with a simple image warp computed from the rotation measured by a MEMS gyroscope.
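The row-by-row warp described above can be sketched in a few lines. This is an illustrative small-angle approximation, not the exact correction used in the paper: it assumes a constant yaw rate from the gyroscope, a fixed row readout time, and a pinhole focal length in pixels, and the function name and parameters are our own.

```python
import numpy as np

def unwarp_rows(depth, yaw_rate, row_time, focal_px):
    """Undo the rolling-shutter-like skew of a row-sequential depth map.

    Row r is captured r * row_time seconds into the frame, so a camera
    yawing at yaw_rate rad/s shifts that row horizontally by roughly
    yaw_rate * r * row_time * focal_px pixels (small-angle model).
    Each row is shifted back by that amount.
    """
    out = np.empty_like(depth)
    for r in range(depth.shape[0]):
        dx = int(round(yaw_rate * r * row_time * focal_px))
        out[r] = np.roll(depth[r], -dx)  # undo the per-row shift
    return out
```

A full correction would also account for pitch and roll with a per-row homography, but the per-row structure is the same.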


The EpiToF is a low power time-of-flight sensor that works in bright ambient light. It has potential applications in outdoor sensing, mobile robotics, and self-driving cars.

Outdoor Depth Imaging

Contemporary depth imaging systems such as the Microsoft Kinect, Intel RealSense, and ToF cameras work indoors, where there is little ambient light, but not outdoors in sunlight.

EpiToF's energy-efficient, real-time depth sensing technology works both indoors and outdoors and could power the next generation of outdoor imaging systems.

Sensing for Self-Driving Cars

Today's self-driving cars rely heavily on expensive LIDAR units that produce only sparse point clouds. The EpiToF produces denser point clouds for high-resolution object recognition and obstacle avoidance.

How it Works

The Prototype

The EpiToF prototype is built from an off-the-shelf continuous-wave time-of-flight camera and a custom-built projector.

  • a continuous wave time-of-flight camera
  • a custom modulated light sheet projector

Performance Specs:

  • Resolution: 320x240
  • FOV: 45°(h) x 35°(v)
  • Range in sunlight (white scene)*: 15 meters
  • Range in darkness (white scene): 20+ meters
  • Frame Rate: 7.5 Hz
  • *See paper for range details

Our prototype

The Light Sheet Projector

The modulated light sheet projector consists of:

  • a 700mW, 658nm collimated laser diode
  • a Powell Lens line generator
  • a galvomirror

Light from the collimated laser diode passes through the Powell Lens line generator to form a light sheet, which is then deflected and steered by the galvomirror in sync with the ToF camera. The laser is amplitude-modulated at a fixed frequency to produce the modulated light sheet.
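The galvomirror's sweep can be pictured as a per-row schedule of deflection angles, one for each camera row. A minimal sketch, assuming the 35° vertical field of view and ~100 us row time quoted in the specs above, and ignoring the factor of two between mirror angle and beam deflection; the function name is our own:

```python
import math

def galvo_schedule(n_rows, fov_v_deg, row_time):
    """Return (beam angle in radians, fire time in seconds) for each
    camera row, sweeping the light sheet uniformly across the vertical
    field of view in lockstep with the sensor's exposed row."""
    half = math.radians(fov_v_deg) / 2.0
    step = math.radians(fov_v_deg) / (n_rows - 1)
    return [(-half + r * step, r * row_time) for r in range(n_rows)]

sched = galvo_schedule(240, 35.0, 100e-6)  # one angle per sensor row
```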

Light Sheet Projector

The Working Principle

The EpiToF sensor's light sheet projector illuminates the scene one scanline at a time. The ToF camera and projector are aligned so that, by epipolar geometry, one projector scanline corresponds to a single row of pixels in the ToF camera. The ROI on the ToF camera is selected and synchronized so that the exposed row moves in lockstep with the active projector scanline. At each row the camera captures multiple images of the scene at different phases of the modulated wave, and the depth map is then calculated from these images using the phase shift principle.
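The phase shift computation can be sketched with the standard four-bucket CW-ToF estimator. This is the generic textbook formula, not code from the EpiToF system; the function name, the 15 MHz modulation frequency, and the ideal noise-free measurements are illustrative assumptions.

```python
import math

C = 299792458.0  # speed of light, m/s

def tof_depth(q0, q90, q180, q270, f_mod):
    """Estimate depth from four correlation measurements taken at
    phase offsets 0, 90, 180 and 270 degrees (the standard four-bucket
    CW-ToF estimator). f_mod is the modulation frequency in Hz."""
    phase = math.atan2(q90 - q270, q0 - q180)  # phase shift, in [-pi, pi]
    phase %= 2 * math.pi                       # wrap to [0, 2*pi)
    # The round-trip distance is phase/(2*pi) of one modulation wavelength.
    return (C / f_mod) * phase / (4 * math.pi)

# A target 5 m away at 15 MHz modulation delays the signal by 2*d/c,
# i.e. a phase shift of 2*pi * f * (2*d/c); each bucket samples the
# correlation A*cos(shift - offset) at its phase offset.
f = 15e6
d_true = 5.0
shift = 2 * math.pi * f * (2 * d_true / C)
q = [math.cos(shift - k * math.pi / 2) for k in range(4)]
d_est = tof_depth(*q, f)
```

The unambiguous range of this estimator is c / (2 * f_mod), about 10 m at 15 MHz, which is why the modulation frequency trades off range against depth resolution.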

Since the projected scanline and exposed camera row are synchronized, the exposure can be very short (~100us). This short exposure integrates very little ambient light while still collecting all the light from the projector. In addition, only light paths that follow the epipolar constraint between the projector and camera reach the camera sensor, which blocks almost all multipath light.
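The ambient-light advantage of the short row exposure can be made concrete with a back-of-the-envelope calculation, assuming for illustration a 240-row frame and the ~100us row exposure mentioned above:

```python
rows = 240
row_exposure = 100e-6                  # seconds each row is exposed (epipolar)
frame_exposure = rows * row_exposure   # a conventional sensor would expose
                                       # every row for the full frame time

# The projector delivers the same energy to each row in both schemes,
# but the ambient light a pixel integrates scales with its exposure
# time, so epipolar imaging admits ~240x less ambient light per pixel.
ambient_reduction = frame_exposure / row_exposure
```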

Working Principle

More Details

For an in-depth description of the technology behind the EpiToF sensor, please refer to our paper and the accompanying video.

Supreeth Achar, Joseph R. Bartels, William L. ‘Red’ Whittaker, Kiriakos N. Kutulakos, Srinivasa G. Narasimhan. "Epipolar Time-of-Flight Imaging", ACM SIGGRAPH 2017

Link to paper
Link to paper video


The EpiToF sensor is the result of a collaborative effort between researchers at Carnegie Mellon University and the University of Toronto.

Patents Pending.

Supreeth Achar
Joe Bartels
William "Red" Whittaker
Kyros Kutulakos
Srinivasa Narasimhan


This work is sponsored by the Office of Naval Research (Grant N000141512358, DURIP N000141612906), the National Aeronautics and Space Administration (Grant NNX16AD98G), the Defense Advanced Research Projects Agency (REVEAL Grant HR00111620021) and the Natural Sciences and Engineering Research Council of Canada (NSERC) under the RGPIN and SPG programs. J. Bartels was supported by NASA fellowship NNX14AM53H.

Copyright © 2017 Supreeth Achar