Event Guided Depth Sensing


Manasi Muglikar, Diederik Paul Moeys and Davide Scaramuzza


Active depth systems like traditional and event-based Structured Light (SL) and Light Detection And Ranging (LiDAR) systems sample the depth of the entire scene at a fixed scan rate. This leads to limited spatio-temporal resolution, where redundant static information is over-sampled and precious motion information might be under-sampled. We take inspiration from human perception, which scans areas of interest at the highest resolution while sampling other regions sparsely. In this paper, we present an efficient bio-inspired event-camera-driven depth estimation algorithm. In our approach, we dynamically illuminate areas of interest densely, depending on the scene activity detected by the event camera, and under-sample areas in the field of view with no motion. Thus, with our setup, we only need to scan small regions densely. This can potentially reduce power consumption and increase the depth scanning frequency. The depth estimation is achieved by a point laser directed at areas of interest, coupled with a second event-based sensor tuned to detect its pulses. We show the feasibility of our approach in a simulated autonomous driving scenario and on real indoor sequences using our prototype.
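The activity-driven sampling idea in the abstract can be sketched as follows: accumulate incoming events into a coarse spatial grid and mark only the high-activity cells for dense laser scanning. This is a hypothetical minimal sketch for illustration; the function name, grid-cell size, and threshold are assumptions, not the authors' actual implementation.

```python
import numpy as np

def select_scan_regions(events, sensor_shape, cell=16, activity_thresh=5):
    """Accumulate event pixel coordinates into a coarse activity grid and
    return the grid cells whose event count exceeds a threshold.
    These cells would be scanned densely by the point laser; the rest of
    the field of view is under-sampled. (Illustrative sketch only.)"""
    h, w = sensor_shape
    grid = np.zeros((h // cell, w // cell), dtype=np.int32)
    for x, y in events:  # events: iterable of (x, y) pixel coordinates
        grid[y // cell, x // cell] += 1
    # indices of cells with enough motion activity to warrant dense scanning
    return np.argwhere(grid >= activity_thresh)

# Toy usage: a burst of events in one corner of a 64x64 sensor.
events = [(3, 2)] * 10 + [(50, 50)]  # 10 events in cell (0, 0), 1 event elsewhere
dense_cells = select_scan_regions(events, (64, 64), cell=16, activity_thresh=5)
```

In this toy example only the cell containing the event burst is selected, so the laser would scan a small fraction of the scene densely, which is the mechanism the abstract credits for the potential power and scan-rate gains.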


  Important Dates

All deadlines are 23:59 Pacific Time (PT). No extensions will be granted.

Paper registration July 30, 2021 (extended from July 23)
Paper submission July 30, 2021
Supplementary August 8, 2021
Tutorial submission August 15, 2021
Tutorial notification August 31, 2021
Rebuttal period September 16-22, 2021
Paper notification October 1, 2021
Camera ready October 15, 2021
Demo submission November 15, 2021 (extended from July 30)
Demo notification November 19, 2021 (extended from October 1)
Tutorial November 30, 2021
Main conference December 1-3, 2021