Nonlinear PDEs and regularization in machine learning

Tuesday, September 17, 2019 - 10:30am - 11:30am
Lind 305
Jeff Calder (University of Minnesota, Twin Cities)
This talk will focus on recent connections between nonlinear PDEs and regularization in machine learning. First, we will consider graph-based semi-supervised learning, where graph Laplacian regularization is widely used. In the limit of vanishingly few labels, Laplacian learning is ill-posed, returning nearly the same label for all data points. We will present new models for regularization in graph-based learning that are provably well-posed with very few labels, and will show how rigorous analysis of the corresponding PDE limits can lead to new insights and fast computational algorithms.

Second, we will discuss open problems concerning the generalization and adversarial robustness of deep neural networks, both of which are closely related to regularization. Deep neural networks are highly expressive, and classical statistical learning theory has been unable to adequately explain why deep learning generalizes well. Deep learning is also susceptible to adversarial attacks, in which an adversary can manipulate the network into producing incorrect results by making imperceptible modifications to the input data. We connect both issues to regularization and present some preliminary theoretical results and numerical experiments on Lipschitz regularized deep learning.
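As background for the first theme, the sketch below shows classical graph Laplacian (harmonic) learning, in which the given labels are extended to the rest of the graph by solving the discrete Laplace equation on the unlabeled nodes. This is the standard baseline the abstract refers to, not the new well-posed models; the function name and the +1/-1 label encoding are illustrative assumptions.

```python
import numpy as np

def laplace_learning(W, labeled_idx, labels):
    """Classical graph Laplacian semi-supervised learning.

    Fixes u = labels on the labeled nodes and solves the discrete
    Laplace equation L u = 0 on the unlabeled nodes (harmonic extension).

    W           : (n, n) symmetric similarity (weight) matrix of the graph
    labeled_idx : integer indices of the labeled nodes
    labels      : label values at those nodes (e.g. +1/-1 for two classes)
    """
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W   # unnormalized graph Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)

    u = np.zeros(n)
    u[labeled_idx] = labels
    # Block-solve L u = 0 for the unlabeled values: L_uu u_u = -L_ul u_l
    L_uu = L[np.ix_(unlabeled, unlabeled)]
    L_ul = L[np.ix_(unlabeled, labeled_idx)]
    u[unlabeled] = np.linalg.solve(L_uu, -L_ul @ np.asarray(labels, dtype=float))
    return u   # threshold at 0 to classify
```

With only a handful of labeled points, the solution u is nearly constant away from the labels; this is the ill-posedness in the vanishing-label limit that the new models are designed to avoid.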
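For the second theme, a common computational proxy for Lipschitz regularization is to penalize the norm of the gradient of the loss with respect to the network's input. The PyTorch sketch below adds such a penalty to a standard training loss; the function name and the weight lam are hypothetical, and this is only one way to approximate a Lipschitz penalty, not necessarily the formulation used in the talk.

```python
import torch
import torch.nn.functional as F

def lipschitz_regularized_loss(model, x, y, lam=0.1):
    """Cross-entropy loss plus an input-gradient penalty approximating
    a Lipschitz regularizer on the network.

    model : classifier mapping inputs to logits
    x, y  : input batch and integer class labels
    lam   : regularization weight (hypothetical default)
    """
    x = x.detach().clone().requires_grad_(True)
    data_loss = F.cross_entropy(model(x), y)
    # Gradient of the loss w.r.t. the inputs; its norm measures how
    # sharply the network responds to small input perturbations.
    (grad,) = torch.autograd.grad(data_loss, x, create_graph=True)
    penalty = grad.flatten(1).norm(dim=1).pow(2).mean()
    return data_loss + lam * penalty
```

Because the penalty discourages large input gradients, it simultaneously targets adversarial robustness (imperceptible perturbations cannot change the output much) and acts as a regularizer relevant to generalization, which is the connection the abstract draws between the two issues.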