A Well-Tempered Landscape for Non-convex Robust Subspace Recovery
We present a mathematical analysis of a gradient descent method for Robust Subspace Recovery. The optimization is cast as a minimization over the Grassmannian manifold, and gradient steps are taken along geodesics. We show that, under a generic condition, the energy landscape is well behaved enough for the non-convex gradient method to exactly recover an underlying subspace. The condition is shown to hold with high probability for a certain model of data. This work is joint with Tyler Maunu and Teng Zhang.
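The geodesic gradient scheme sketched in the abstract can be illustrated with a short toy implementation. This is only an illustrative sketch, not the authors' exact algorithm: the least-absolute-deviations energy, the backtracking step rule, and all parameter names (`step0`, `n_iters`, `eps`) are our assumptions. The subspace is represented by an orthonormal basis, and each step follows an exact Grassmannian geodesic computed from a thin SVD of the tangent direction.

```python
import numpy as np

def rsr_geodesic_gd(X, d, n_iters=300, step0=0.05, eps=1e-12, seed=0):
    """Illustrative sketch: geodesic gradient descent on the Grassmannian
    for the least-absolute-deviations energy F(V) = sum_i ||(I - V V^T) x_i||.
    X is (N, D); returns a (D, d) orthonormal basis of the recovered subspace.
    The backtracking step rule here is our assumption, not the paper's schedule."""
    D = X.shape[1]
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((D, d)))   # random orthonormal start

    def energy(B):
        R = X - (X @ B) @ B.T                          # residuals (I - BB^T) x_i
        return np.linalg.norm(R, axis=1).sum()

    f = energy(V)
    for _ in range(n_iters):
        R = X - (X @ V) @ V.T
        norms = np.maximum(np.linalg.norm(R, axis=1), eps)
        # Riemannian gradient of F at V; it is tangent since R V = 0
        G = -(R / norms[:, None]).T @ (X @ V)
        t, accepted = step0, False
        while t > 1e-10:
            # exact geodesic step: thin SVD of the tangent direction -t*G,
            # then V(t) = V W cos(S) W^T + U sin(S) W^T
            U, S, Wt = np.linalg.svd(-t * G, full_matrices=False)
            Vt = V @ Wt.T @ np.diag(np.cos(S)) @ Wt + U @ np.diag(np.sin(S)) @ Wt
            Vt, _ = np.linalg.qr(Vt)                   # guard against roundoff drift
            ft = energy(Vt)
            if ft < f:                                 # backtracking acceptance
                V, f, accepted = Vt, ft, True
                break
            t /= 2.0
        if not accepted:                               # no decrease found: stop
            break
    return V
```

On synthetic data with inliers on a low-dimensional subspace and a minority of outliers, the iterates drive the sum of point-to-subspace distances down toward the underlying subspace; the paper's landscape condition is what guarantees that such descent cannot get trapped away from it.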
Lerman received his Ph.D. in Mathematics at Yale University in 2000 under the direction of Ronald Coifman and Peter Jones. His postdoctoral experience included a Courant Instructorship (2000-2003) at New York University’s Courant Institute of Mathematical Sciences and training in bioinformatics as a research scientist in Bud Mishra’s Lab (2003-2004) at the same institute. He was a recipient of an NSF CAREER award in 2010 and the Feinberg Foundation Visiting Faculty Fellowship at the Weizmann Institute in 2013.
Lerman has extensive experience working with industry, both as a consultant and as a collaborator. His areas of research and expertise include high-dimensional data, machine learning, algorithm design, and mathematical foundations of data analysis. As director of the Data Science Lab, his goal is to provide industry with access to academic research and tools for the analysis of data.