Deep Regression on Manifolds: A 3D Rotation Case Study

Author:

Romain Brégier

Abstract:

Many machine learning problems involve regressing variables on a non-Euclidean manifold -- e.g. a discrete probability distribution, or the 6D pose of an object. One approach to tackling these problems through gradient-based learning is to use a differentiable function that maps arbitrary inputs from a Euclidean space onto the manifold. In this work, we establish a set of desirable properties for such a mapping, and in particular highlight the importance of the connectivity and convexity of its pre-images. We illustrate these properties with a case study on 3D rotations. Through theoretical considerations and methodological experiments on a variety of tasks, we review various differentiable mappings onto the 3D rotation space, and conjecture about the importance of their local linearity. We show that a mapping based on Procrustes orthonormalization generally performs best among those considered, but that a rotation vector representation may also be suitable when restricted to small angles.
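
To make the Procrustes-based mapping concrete, the following is a minimal PyTorch sketch (not the paper's reference implementation; function and variable names are illustrative) of how unconstrained 9D network outputs can be mapped onto the rotation manifold via an SVD:

```python
import torch

def procrustes_to_rotation(m: torch.Tensor) -> torch.Tensor:
    """Map arbitrary 3x3 matrices onto the closest rotation matrices in SO(3).

    Procrustes orthonormalization: given the SVD m = U S V^T, the closest
    rotation in the Frobenius sense is U diag(1, 1, det(U V^T)) V^T.
    Gradients flow through the SVD, so this can sit on top of a network head
    regressing 9 unconstrained values per sample.
    """
    u, _, vt = torch.linalg.svd(m)
    # Flip the sign associated with the smallest singular value when needed,
    # so the result is a proper rotation (det = +1) rather than a reflection.
    det = torch.det(u @ vt)
    ones = torch.ones_like(det)
    d = torch.diag_embed(torch.stack([ones, ones, det], dim=-1))
    return u @ d @ vt

# Usage: map a batch of raw 9D network outputs to rotation matrices.
x = torch.randn(8, 9, requires_grad=True)        # hypothetical network output
R = procrustes_to_rotation(x.reshape(-1, 3, 3))  # (8, 3, 3), each R in SO(3)
```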
