Deep Compressed Sensing

Tuesday, October 3, 2017 - 1:25pm - 2:25pm
Lind 305
Paul Hand (Rice University)
Recent empirical work has shown that combining principles of compressed sensing with deep neural network-based generative image priors can require roughly 10x fewer measurements than traditional compressed sensing in certain scenarios. As deep generative priors (such as those obtained via generative adversarial training) improve, analogous improvements in the performance of compressed sensing and other inverse problems may be realized across the imaging sciences. In joint work with Vladislav Voroninski, we provide a theoretical framework for studying inverse problems subject to deep generative priors. In particular, we prove that, with high probability, the non-convex empirical risk objective for enforcing a random deep generative prior under compressive random linear observations of the generator's last layer has no spurious local minima, and that for a fixed network depth, these guarantees hold at order-optimal sample complexity.
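
The recovery formulation described above can be sketched in a few lines of Python. The following is a minimal illustration, not code from the talk: a small random two-layer ReLU generator G, a random Gaussian measurement matrix A with far fewer rows than the signal dimension, and minimization of the empirical risk ||A G(z) - y||^2 over the latent code z. All dimensions, weight scalings, and the choice of L-BFGS as the optimizer are assumptions made for the example; the talk's guarantees concern the landscape of this same objective.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Illustrative sizes: latent dim k, hidden width h, signal dim n, measurements m << n.
k, h, n, m = 10, 80, 400, 60

# Random two-layer ReLU generator G(z) = W2 @ relu(W1 @ z); weights are i.i.d.
# Gaussian, scaled (illustratively) so activations stay at a reasonable magnitude.
W1 = rng.normal(size=(h, k)) * np.sqrt(2.0 / h)
W2 = rng.normal(size=(n, h)) * np.sqrt(2.0 / n)
A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian measurement matrix

relu = lambda t: np.maximum(t, 0.0)
G = lambda z: W2 @ relu(W1 @ z)

# Ground-truth latent code and its compressive linear observations y = A G(z*).
z_true = rng.normal(size=k)
y = A @ G(z_true)

def empirical_risk(z):
    """Objective 0.5*||A G(z) - y||^2 and its gradient with respect to z."""
    u = W1 @ z
    r = A @ (W2 @ relu(u)) - y
    grad = W1.T @ ((u > 0) * (W2.T @ (A.T @ r)))   # backprop through the ReLU layer
    return 0.5 * float(r @ r), grad

# Minimize the non-convex empirical risk over the latent code from a random start.
res = minimize(empirical_risk, rng.normal(size=k), jac=True, method="L-BFGS-B",
               options={"maxiter": 1000})

x_hat, x_star = G(res.x), G(z_true)
print("final objective:", res.fun)
print("relative reconstruction error:",
      np.linalg.norm(x_hat - x_star) / np.linalg.norm(x_star))

In this toy setting the reconstruction quality depends on the random draw and on the illustrative sizes chosen above; the theoretical results discussed in the talk characterize when such local optimization of the objective succeeds.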

Bio
Paul Hand is an assistant professor of Computational and Applied Mathematics at Rice University in Houston, Texas. He received his Ph.D. in mathematics from New York University in 2009, after which he was an instructor of Applied Mathematics at MIT. His work focuses on the development and analysis of algorithms for signal recovery problems, with applications in machine learning, computer vision, and imaging.