
Implicit Regularization in Nonconvex Statistical Estimation

Tuesday, January 30, 2018 - 1:25pm - 2:25pm
Lind 305
Yuxin Chen (Princeton University)
Recent years have seen a flurry of activity in both the theory and practice of nonconvex optimization. Carefully designed nonconvex procedures simultaneously achieve optimal statistical accuracy and computational efficiency for many problems. Due to the highly nonconvex landscape, however, state-of-the-art results often require proper regularization procedures (e.g., trimming, projection, or extra penalization) to guarantee fast convergence. For vanilla algorithms, prior theory usually suggests conservative step sizes in order to avoid overshooting.

This talk uncovers a striking phenomenon: even in the absence of explicit regularization, nonconvex gradient descent enforces proper regularization automatically and implicitly under a large family of statistical models. In fact, the vanilla nonconvex procedure follows a trajectory that always stays within a region with benign geometry. This implicit regularization allows the algorithm to proceed in a far more aggressive fashion without overshooting, which in turn enables faster convergence. We will discuss several concrete fundamental problems, including phase retrieval, matrix completion, blind deconvolution, and recovery of structured probability matrices; these findings might shed light on the effectiveness of nonconvex optimization for more general structured recovery problems.
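To make the setting concrete, the following is a minimal NumPy sketch of the vanilla procedure in the phase-retrieval case: spectral initialization followed by plain gradient descent with a constant step size and no trimming, projection, or extra penalization. The problem sizes, step size, and iteration count are chosen purely for illustration and are not taken from the talk or the associated paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Recover x_star from phaseless measurements y_j = (a_j^T x_star)^2.
    n, m = 100, 1000                       # signal dimension, number of measurements
    x_star = rng.standard_normal(n)
    x_star /= np.linalg.norm(x_star)       # unit-norm ground truth
    A = rng.standard_normal((m, n))        # Gaussian sensing vectors a_j (rows of A)
    y = (A @ x_star) ** 2

    # Spectral initialization: leading eigenvector of (1/m) * sum_j y_j a_j a_j^T,
    # rescaled by the estimated signal energy.
    Y = (A.T * y) @ A / m
    eigvals, eigvecs = np.linalg.eigh(Y)
    x = eigvecs[:, -1] * np.sqrt(y.mean())

    # Vanilla gradient descent on f(x) = (1/4m) * sum_j ((a_j^T x)^2 - y_j)^2,
    # with a constant step size and no explicit regularization.
    step = 0.1
    for _ in range(500):
        Ax = A @ x
        grad = A.T @ ((Ax ** 2 - y) * Ax) / m
        x -= step * grad

    # The signal is identifiable only up to a global sign.
    err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
    print(f"relative estimation error: {err:.2e}")

The point of the sketch is that no trimming or projection step appears anywhere in the loop; the claim discussed in the talk is that, for such statistical models, the iterates nonetheless remain in a region where the landscape is well behaved.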

This is joint work with Cong Ma, Kaizheng Wang, and Yuejie Chi.

Bio: Yuxin Chen is currently an assistant professor in the Department of Electrical Engineering at Princeton University. Prior to joining Princeton, he was a postdoctoral scholar in the Department of Statistics at Stanford University, where he also completed his Ph.D. in Electrical Engineering. His research interests include high-dimensional data analysis, convex and nonconvex optimization, statistical learning, and information theory.