Revisiting Nesterov’s Acceleration

Monday, January 25, 2016 - 3:15pm - 4:05pm
Keller 3-180
Sébastien Bubeck (Microsoft)
I will present a new method for unconstrained optimization of a smooth and strongly convex function, which attains the optimal convergence rate of Nesterov's accelerated gradient descent. The new algorithm has a simple geometric interpretation, loosely inspired by the ellipsoid method. In practice, the new method appears to outperform Nesterov's accelerated gradient descent.
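For context, the baseline being matched is Nesterov's accelerated gradient descent. A minimal sketch of the standard constant-momentum variant for an L-smooth, mu-strongly convex objective is below (this illustrates the classical method the talk compares against, not the new geometric algorithm; the function names and the quadratic test problem are illustrative choices, not from the talk):

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, iters=200):
    # Nesterov's accelerated gradient descent for an L-smooth,
    # mu-strongly convex function (constant-momentum variant).
    kappa = L / mu
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
    x = y = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_next = y - grad(y) / L          # gradient step from the extrapolated point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Illustrative problem: f(x) = 0.5 x^T A x - b^T x, so grad f(x) = A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
eigs = np.linalg.eigvalsh(A)
mu, L = eigs[0], eigs[-1]         # strong convexity and smoothness constants
x_star = np.linalg.solve(A, b)    # exact minimizer for comparison
x_hat = nesterov_agd(lambda x: A @ x - b, np.zeros(2), L, mu)
```

The momentum coefficient (sqrt(kappa) - 1)/(sqrt(kappa) + 1) yields the accelerated rate of roughly (1 - 1/sqrt(kappa)) per iteration, versus (1 - 1/kappa) for plain gradient descent.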

Joint work with Yin-Tat Lee and Mohit Singh.