Deep Regression on Manifolds: A 3D Rotation Case Study
Authors: Romain Brégier |
Abstract: Many machine learning problems involve regressing variables that lie on a non-Euclidean manifold, e.g. a discrete probability distribution or the 6D pose of an object. One approach to tackling these problems through gradient-based learning is to use a differentiable function that maps arbitrary inputs of a Euclidean space onto the manifold. In this work, we establish a set of desirable properties for such a mapping, and in particular highlight the importance of pre-image connectivity/convexity. We illustrate these properties with a case study on 3D rotations. Through theoretical considerations and methodological experiments on a variety of tasks, we review various differentiable mappings onto the 3D rotation space, and conjecture about the importance of their local linearity. We show that a mapping based on Procrustes orthonormalization generally performs best among those considered, but that a rotation vector representation may also be suitable when restricted to small angles.
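To make the Procrustes-based mapping mentioned in the abstract concrete, here is a minimal sketch (assuming PyTorch; the function name `procrustes_to_rotation` and the batch handling are illustrative, not the paper's own implementation). It maps an unconstrained 9D network output to the rotation matrix closest to it in Frobenius norm via an SVD, flipping the sign of the last singular direction when needed so the result has determinant +1.

```python
import torch

def procrustes_to_rotation(x: torch.Tensor) -> torch.Tensor:
    """Map unconstrained 9D vectors onto SO(3) by special orthogonal
    Procrustes orthonormalization: reshape to 3x3 matrices and project
    each onto the nearest rotation matrix in Frobenius norm."""
    m = x.reshape(-1, 3, 3)
    u, _, vt = torch.linalg.svd(m)
    # Ensure det(R) = +1 by flipping the sign associated with the
    # smallest singular value when det(U V^T) = -1.
    det = torch.det(u @ vt)
    ones = torch.ones_like(det)
    s = torch.diag_embed(torch.stack([ones, ones, det], dim=-1))
    return u @ s @ vt

# Illustrative usage: project random 9D outputs and check the result.
x = torch.randn(8, 9)
r = procrustes_to_rotation(x)
assert torch.allclose(r @ r.transpose(-1, -2),
                      torch.eye(3).expand(8, 3, 3), atol=1e-5)
assert torch.allclose(torch.det(r), torch.ones(8), atol=1e-5)
```

Because the SVD is differentiable almost everywhere, gradients can flow through this projection during training, which is what allows the 9D representation to be regressed with standard gradient-based optimizers.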