Generic acceleration schemes for gradient-based optimization

Date: 15/06/2018, 2:00 pm - 3:00 pm

In this talk, we present generic techniques to accelerate gradient-based optimization algorithms. These approaches build upon the inexact proximal point algorithm for minimizing a convex objective function, and consist of approximately solving a sequence of well-chosen auxiliary subproblems, leading to faster convergence for the original problem. We introduce two variants based on Nesterov's acceleration and quasi-Newton principles, respectively. One of the keys to achieving acceleration, both in theory and in practice, is to solve these subproblems to appropriate accuracy, using the right stopping criterion and the right warm-start strategy.
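The outer/inner structure described above can be sketched in a few lines. The following is a minimal illustration, not the speaker's actual algorithm: the least-squares objective, the smoothing parameter `kappa`, the fixed momentum weight `beta`, and the fixed inner-iteration budget (standing in for a genuine stopping criterion) are all assumptions chosen for the example.

```python
import numpy as np

# Toy strongly convex problem (assumed for illustration):
# f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)

def grad_f(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad f
kappa = 1.0                     # smoothing parameter of the auxiliary subproblem

def inner_solve(y, x0, n_steps=20):
    """Approximately minimize the auxiliary subproblem
       f(x) + (kappa/2) * ||x - y||^2
    by gradient descent, warm-started at x0 (the previous outer iterate).
    A fixed iteration budget stands in for a proper stopping criterion."""
    x = x0.copy()
    step = 1.0 / (L + kappa)    # subproblem is (L + kappa)-smooth
    for _ in range(n_steps):
        x -= step * (grad_f(x) + kappa * (x - y))
    return x

# Outer loop: inexact proximal point with Nesterov-style extrapolation.
x_prev = np.zeros(20)
y = x_prev.copy()
beta = 0.9                      # illustrative fixed momentum weight
for _ in range(100):
    x = inner_solve(y, x_prev)  # warm start at the previous solution
    y = x + beta * (x - x_prev) # extrapolation step
    x_prev = x

# Compare against the exact least-squares minimizer.
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x_prev - x_star))
```

Each outer iteration only requires an approximate minimizer of a better-conditioned subproblem, which is where the choice of stopping criterion and warm start governs the overall cost.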


