Applying Subgradient Methods -- and Accelerated Gradient Methods -- to Efficiently Solve General Convex, Conic Optimization Problems

Tuesday, January 26, 2016 - 9:00am - 9:50am
Keller 3-180
James Renegar (Cornell University)
Recently we introduced a framework for applying subgradient methods to solve general convex, conic optimization problems. The framework, once seen, is obvious, yet it had not previously appeared in the literature; it was a blind spot. More recently we posted a refinement of the framework for the special case of hyperbolic programming. Hyperbolicity cones possess algebraic structure ideally suited to smoothing. Once a hyperbolic program has been smoothed, virtually any accelerated gradient method can be applied, and if this is done with care, the result is a first-order algorithm with a best-possible iteration bound.
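To fix ideas, the basic projected subgradient iteration that such frameworks build on can be sketched as follows. This is a minimal generic illustration, not the speaker's framework: the l1 objective, the nonnegative-orthant "conic" constraint, and the diminishing step-size schedule are all assumptions chosen only for the example.

```python
import numpy as np

def projected_subgradient(c, steps=2000):
    # Illustrative example: minimize f(x) = ||x - c||_1 over the
    # nonnegative orthant (a simple stand-in for a conic constraint)
    # using a projected subgradient method with diminishing steps.
    x = np.zeros_like(c, dtype=float)
    best, best_val = x.copy(), np.abs(x - c).sum()
    for k in range(steps):
        g = np.sign(x - c)             # a subgradient of the l1 objective
        x = x - g / np.sqrt(k + 1.0)   # subgradient step, t_k = 1/sqrt(k+1)
        x = np.maximum(x, 0.0)         # project back onto the cone x >= 0
        val = np.abs(x - c).sum()
        if val < best_val:             # subgradient methods are not descent
            best, best_val = x.copy(), val  # methods, so track the best iterate
    return best

c = np.array([2.0, -1.0, 0.5])
x_star = projected_subgradient(c)      # approaches max(c, 0) elementwise
```

The best-iterate bookkeeping matters: unlike gradient descent on a smooth function, the subgradient step can increase the objective, which is one reason smoothing plus accelerated methods, as in the hyperbolic-programming refinement above, can yield much better iteration bounds.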

We provide an overview of these developments, then briefly discuss our current work to deepen and broaden the results, both in the pure theory and in the design of algorithms aimed at practice.