BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
TZID:Europe/Paris
X-WR-TIMEZONE:Europe/Paris
BEGIN:VEVENT
UID:6171@i2m.univ-amu.fr
DTSTART;TZID=Europe/Paris:20220131T140000
DTEND;TZID=Europe/Paris:20220131T140000
DTSTAMP:20241120T200927Z
URL:https://www.i2m.univ-amu.fr/evenements/asymptotic-optimality-of-condit
 ioned-sgd/
SUMMARY:François Portier (ENSAI\, CREST): Asymptotic optimality of conditi
 oned SGD
DESCRIPTION:François Portier: Zoom videoconference - link TBP\n\nWe inve
 stigate a general class of stochastic gradient descent (SGD) algorith
 ms\, called conditioned SGD\, based on a preconditioning of the gradi
 ent direction. Under some mild assumptions\, we establish the almost su
 re convergence and the asymptotic normality for a broad class of con
 ditioning matrices. In particular\, when the conditioning matrix is a
 n estimate of the inverse Hessian at the optimal point\, the algorit
 hm is proved to be asymptotically optimal. The benefits of this appr
 oach are validated on simulated and real datasets. As an extension o
 f the conditioned SGD framework\, we shall conclude by presenting a n
 ew class of methods in which\, at each iteration\, a single descent d
 irection is selected at random.\n\nThis is joint work with Rémi Leluc (
 https://remileluc.github.io/)\n\nLink to the paper: https://arxiv.org
 /abs/2006.02745
ATTACH;FMTTYPE=image/jpeg:https://www.i2m.univ-amu.fr/wp-content/uploads/2
 021/11/Francois_Portier.jpg
CATEGORIES:Séminaire,Hybrid,Statistique
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:STANDARD
DTSTART:20211031T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
END:VTIMEZONE
END:VCALENDAR