Event Guided Depth Sensing
Authors: Manasi Muglikar, Diederik Paul Moeys and Davide Scaramuzza
Abstract: Active depth systems like traditional and event-based Structured Light (SL) and Light Detection And Ranging (LiDAR) systems sample the depth of the entire scene at a fixed scan rate. This leads to limited spatio-temporal resolution, where redundant static information is over-sampled and precious motion information might be under-sampled. We take inspiration from human perception, which scans areas of interest with the highest resolution while sampling other regions sparsely. In this paper, we present an efficient bio-inspired event-camera-driven depth estimation algorithm. In our approach, we dynamically illuminate areas of interest densely, depending on the scene activity detected by the event camera, and under-sample areas in the field of view with no motion. Thus, with our setup, we only need to scan small regions densely. This can potentially reduce power consumption and increase the depth scanning frequency. The depth estimation is achieved by a point laser directed at areas of interest, coupled with a second event-based sensor tuned to detect its pulses. We show the feasibility of our approach in a simulated autonomous driving scenario and real indoor sequences using our prototype.
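To make the adaptive sampling idea in the abstract concrete, the following is a minimal Python sketch, under assumed conventions: events are given as (x, y, t, polarity) tuples, the tile size, activity threshold, and dwell-rate values are hypothetical placeholders, and the sketch is not the authors' released implementation. It accumulates events into a coarse activity map, marks tiles with sufficient motion as areas of interest, and allots them a dense scan rate while the static background is sampled sparsely.

```python
import numpy as np

# Illustrative sketch of event-guided adaptive depth sampling.
# All parameter values and function names below are assumptions for
# demonstration; they are not taken from the paper's implementation.

HEIGHT, WIDTH = 480, 640
TILE = 32                  # side length (pixels) of a candidate illumination tile
ACTIVITY_THRESHOLD = 50    # events per tile per time window -> "area of interest"

def activity_map(events, height=HEIGHT, width=WIDTH, tile=TILE):
    """Accumulate events into a coarse per-tile event-count histogram."""
    grid = np.zeros((height // tile, width // tile), dtype=np.int32)
    for x, y, t, polarity in events:
        grid[y // tile, x // tile] += 1
    return grid

def select_active_tiles(grid, threshold=ACTIVITY_THRESHOLD):
    """Return indices of tiles whose event count exceeds the activity threshold."""
    return set(zip(*np.nonzero(grid >= threshold)))

def adaptive_scan_plan(events, dense_rate=1.0, sparse_rate=0.05):
    """Assign a laser dwell fraction per tile: dense for moving regions, sparse otherwise."""
    grid = activity_map(events)
    active = select_active_tiles(grid)
    plan = {}
    for r in range(grid.shape[0]):
        for c in range(grid.shape[1]):
            plan[(r, c)] = dense_rate if (r, c) in active else sparse_rate
    return plan

# Example usage with a few synthetic events clustered in one tile:
events = [(100, 100, t, 1) for t in range(60)] + [(500, 400, 0, -1)]
plan = adaptive_scan_plan(events)
```

In this sketch, the scan plan is simply a per-tile dwell fraction; in the paper's setup, the densely sampled regions would be the ones the point laser illuminates, with its pulses detected by the second event-based sensor to recover depth.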