BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Europe/Paris
BEGIN:VEVENT
UID:4053@i2m.univ-amu.fr
DTSTART;TZID=Europe/Paris:20211216T100000
DTEND;TZID=Europe/Paris:20211216T110000
DTSTAMP:20211203T130930Z
URL:https://www.i2m.univ-amu.fr/evenements/mean-nystrom-embeddings-for-ada
 ptive-compressive-learning/
SUMMARY:Antoine CHATALIC (University of Genoa\, Italy): Mean Nyström Embed
 dings for Adaptive Compressive Learning
DESCRIPTION:Antoine CHATALIC: Compressive learning is an approach to effi
 cient large-scale learning based on sketching an entire dataset to a sing
 le mean embedding (the sketch)\, i.e. a vector of generalized moments. Th
 e learning task is then approximately solved as an inverse problem using
  an adapted parametric model. In this talk\, we will first provide a gene
 ral overview of this learning framework\, and then look at more recent re
 sults on adaptive compressive learning. Previous works have indeed focuse
 d on sketches obtained by averaging random features\, which\, while unive
 rsal\, can be poorly adapted to the problem at hand. We propose and study
  the idea of performing sketching based on data-dependent Nyström approx
 imation. From a theoretical perspective\, we prove that the excess risk c
 an be controlled under a geometric assumption relating the parametric mod
 el used to learn from the sketch and the covariance operator associated w
 ith the task at hand. Empirically\, we show for k-means clustering and Ga
 ussian modeling that\, for a fixed sketch size\, Nyström sketches indeed
  outperform those built with random features.
ATTACH;FMTTYPE=image/jpeg:https://www.i2m.univ-amu.fr/wp-content/uploads/2
 021/12/Antoine_Chatalic.png
CATEGORIES:Séminaire,Signal et Apprentissage
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:STANDARD
DTSTART:20211031T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
END:VTIMEZONE
END:VCALENDAR