Generic acceleration schemes for gradient-based optimization
Date(s): 15/06/2018
14h00 - 15h00
In this talk, we present generic techniques to accelerate gradient-based optimization algorithms. These approaches build upon the inexact proximal point algorithm for minimizing a convex objective function, and consist of approximately solving a sequence of well-chosen auxiliary subproblems, leading to faster convergence for the original problem. We introduce two variants, based on Nesterov's acceleration and on Quasi-Newton principles, respectively. One of the keys to achieving acceleration, both in theory and in practice, is to solve these subproblems to an appropriate accuracy, by using the right stopping criterion and the right warm-start strategy.
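To illustrate the idea, here is a minimal sketch of an inexact proximal point scheme with Nesterov-style extrapolation, in the spirit of the approach described above. All concrete choices (the quadratic test objective, the smoothing parameter `kappa`, the fixed inner iteration budget used as a crude stopping rule, and warm-starting each subproblem at the previous iterate) are illustrative assumptions, not the exact method of the talk:

```python
import numpy as np

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + 0.1 * np.eye(n)   # positive definite
b = rng.standard_normal(n)

def grad_f(x):
    return A @ x - b

def accelerated_inexact_prox(kappa=1.0, outer_iters=100, inner_iters=50):
    """Sketch: outer loop = inexact proximal point + Nesterov extrapolation;
    inner loop = gradient descent on the auxiliary subproblem,
    warm-started at the previous outer iterate."""
    x = np.zeros(n)
    x_prev = x.copy()
    # Lipschitz constant of the subproblem's gradient (f's constant + kappa).
    L = np.linalg.eigvalsh(A).max() + kappa
    for k in range(outer_iters):
        beta = k / (k + 3)            # extrapolation weight (illustrative choice)
        y = x + beta * (x - x_prev)   # extrapolated prox center
        # Approximately solve min_z f(z) + (kappa/2)||z - y||^2,
        # warm-started at the previous solution x (fixed inner budget
        # stands in for a proper stopping criterion).
        z = x.copy()
        for _ in range(inner_iters):
            z -= (grad_f(z) + kappa * (z - y)) / L
        x_prev, x = x, z
    return x

x_star = np.linalg.solve(A, b)            # exact minimizer for reference
x_hat = accelerated_inexact_prox()
print(np.linalg.norm(x_hat - x_star) / np.linalg.norm(x_star))
```

Because each subproblem adds a `kappa`-strongly-convex term around the prox center, it is better conditioned than the original problem; the warm start and the (here simplified) accuracy control are what keep the total inner work low enough for the outer acceleration to pay off.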
http://lear.inrialpes.fr/people/mairal/