Ph.D. candidate Qing Qu from Columbia University visited on December 30, 2015

Ph.D. candidate Qing Qu from Columbia University visited our lab on December 30, 2015. He gave a talk on "When are nonconvex problems not scary?" in Room 9-206, Rohm Building, at 10 AM.

Qing Qu is a Ph.D. student in Electrical Engineering at Columbia University, working with Prof. John Wright. He received a B.Eng. in Electronic Engineering from Tsinghua University, Beijing, in Jul. 2011, and an M.Sc. in Electrical Engineering from Johns Hopkins University, Baltimore, in Dec. 2012. He interned at the U.S. Army Research Laboratory from Jun. 2012 to Aug. 2013. His research interests lie at the intersection of machine learning, numerical optimization, signal/image processing, information theory, and compressed sensing; his current work focuses on developing practical algorithms and provable guarantees for nonconvex problems with low intrinsic structure. He received a spotlight talk award at the NY Machine Learning Symposium and the Best Student Paper Award at SPARS'15 (with Ju Sun and John Wright), and is currently a finalist for the 2016-18 Microsoft Research Fellowship in North America.


Title:

When are nonconvex problems not scary?

Abstract:

The algorithmic difficulties of nonconvex optimization arise for a very good reason: in general, such problems are NP-hard. However, for many problems of interest in machine learning and signal processing, simple nonconvex heuristics are often surprisingly effective, despite our limited understanding of why. Towards understanding nonconvex optimization, this talk will describe a family of nonconvex problems that can be solved efficiently. These problems have a benign high-dimensional geometry: every local minimizer recovers the object of interest, and all other critical points are "strict" saddle points. Natural formulations of a number of important problems lie in this family, including the eigenvalue problem, complete dictionary learning (DL), generalized phase retrieval (PR), and orthogonal tensor decomposition. This structure allows the design of second-order algorithms that exploit the function geometry to escape saddle points and provably return the true solution. The combination of this geometric characterization of the function landscape with second-order algorithms has led to new computational guarantees for several important nonconvex problems, including DL and PR. As an ongoing effort to enrich this line of research, the talk ends by highlighting current challenges for nonconvex optimization on both the theoretical and algorithmic sides.
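For readers unfamiliar with the "strict" saddle notion above, the following is a minimal illustrative sketch (not material from the talk): at a strict saddle point the gradient vanishes, but the Hessian has a strictly negative eigenvalue, so the corresponding eigenvector is a descent direction that a second-order method can follow. The toy function f(x, y) = x^2 - y^2, which has a strict saddle at the origin, and the step size are assumptions chosen purely for illustration.

    # Illustrative sketch only: escaping a strict saddle using second-order
    # (Hessian) information on a toy function (assumed for illustration).
    import numpy as np

    def f(v):
        x, y = v
        return x**2 - y**2          # strict saddle at the origin

    def grad(v):
        x, y = v
        return np.array([2.0 * x, -2.0 * y])

    def hess(v):
        return np.array([[2.0, 0.0],
                         [0.0, -2.0]])

    saddle = np.array([0.0, 0.0])
    print(grad(saddle))              # zero gradient: first-order methods stall here

    eigvals, eigvecs = np.linalg.eigh(hess(saddle))
    d = eigvecs[:, np.argmin(eigvals)]   # direction of most negative curvature

    step = 0.1
    print(f(saddle), f(saddle + step * d))   # f strictly decreases along d

In this toy case the negative-curvature direction is the y-axis; a second-order method that computes (or approximates) such a direction can always make progress at a strict saddle, which is the geometric property the abstract refers to.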