# Taming Nonconvexity: From Smooth to Nonsmooth Problems and Beyond

Tuesday, September 10, 2019 - 1:25pm - 2:25pm

Lind 305

Ju Sun (University of Minnesota, Twin Cities)

Most applied problems we encounter can be naturally written as nonconvex optimization problems, for which obtaining even a local minimizer is computationally hard in theory, never mind the global minimizer. In practice, however, simple numerical methods often work surprisingly well in finding high-quality solutions, e.g., training deep neural networks.

In this talk, I will describe our recent effort in bridging the mysterious theory-practice gap for nonconvex optimization, in the context of solving practical problems in signal processing, machine learning, and scientific imaging. 1) I will highlight a family of smooth nonconvex problems that can be solved to global optimality using simple numerical methods, independent of initialization. 2) The discovery, however, does not cover nonsmooth functions, which are frequently used to encode structural objects (e.g., sparsity) or achieve robustness. I will introduce tools from nonsmooth analysis, and demonstrate how nonsmooth, nonconvex problems can also be analyzed and solved in a provable manner. 3) Toward the end, I will provide examples to show how innovative problem formulation and physical design can help to tame nonconvexity.

Ju Sun is an assistant professor in the Department of Computer Science & Engineering at the University of Minnesota, Twin Cities. Prior to this, he was a postdoctoral scholar at Stanford University, working with Professor Emmanuel Candès. He received his Ph.D. degree in Electrical Engineering from Columbia University in 2016 (2011--2016) and his B.Eng. degree in Computer Engineering (with a minor in Mathematics) from the National University of Singapore in 2008 (2004--2008). His research interests span computer vision, machine learning, numerical optimization, signal/image processing, and high-dimensional data analysis. Recently, he has been particularly fascinated by why simple numerical methods often solve nonconvex problems surprisingly well (on which he maintains a bibliographic webpage: http://sunju.org/research/nonconvex/ ) and by the implications for representation learning. He won the best student paper award at SPARS'15 and received an honorable mention for his doctoral thesis at the New World Mathematics Awards (NWMA) 2017.
