Generic acceleration schemes for gradient-based optimization


Date: 15/06/2018
2:00 pm - 3:00 pm


In this talk, we present generic techniques to accelerate gradient-based optimization algorithms. These approaches build upon the inexact proximal point algorithm for minimizing a convex objective function, and consist of approximately solving a sequence of well-chosen auxiliary subproblems, leading to faster convergence for the original problem. We introduce two variants, based on Nesterov’s acceleration and quasi-Newton principles, respectively. One of the keys to achieving acceleration, both in theory and in practice, is to solve these subproblems with appropriate accuracy, by using the right stopping criterion and the right warm-start strategy.
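As a rough illustration of the scheme described above, the following sketch implements a Catalyst-style outer loop: each outer iteration approximately minimizes an auxiliary subproblem h_k(x) = f(x) + (kappa/2)||x - y_k||^2 with a few warm-started gradient steps, then applies a Nesterov-style extrapolation to the anchor point. All function names, the momentum schedule, and the fixed inner-iteration stopping rule are illustrative assumptions, not the speaker's exact algorithm.

```python
import numpy as np

def accelerated_prox_point(grad_f, x0, kappa=1.0, step=0.01,
                           n_outer=200, n_inner=20):
    """Sketch of acceleration via inexact proximal point (assumed details).

    Each outer step approximately solves the auxiliary subproblem
        min_x  f(x) + (kappa/2) ||x - y_k||^2
    by a fixed number of inner gradient steps (a stand-in for a proper
    stopping criterion), warm-started at the current iterate, then
    extrapolates the anchor y_k in a Nesterov fashion.
    """
    x = x0.copy()
    y = x0.copy()
    for k in range(n_outer):
        # Inner loop: approximate minimizer of the subproblem,
        # warm-started at the previous outer iterate.
        z = x.copy()
        for _ in range(n_inner):
            g = grad_f(z) + kappa * (z - y)   # gradient of the subproblem
            z -= step * g
        x_prev, x = x, z
        # Nesterov-style extrapolation of the anchor point.
        beta = k / (k + 3.0)                  # simple momentum schedule
        y = x + beta * (x - x_prev)
    return x

# Usage: minimize a convex least-squares objective f(x) = 0.5 ||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = accelerated_prox_point(grad_f, np.zeros(5))
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
```

The warm start (initializing the inner loop at the current iterate) and the choice of inner accuracy are exactly the ingredients the abstract highlights as critical for acceleration.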

