Mean Nyström Embeddings for Adaptive Compressive Learning

Antoine CHATALIC
University of Genoa, Italy
https://achatali.gitlab.io/

Date: 16/12/2021, 10:00 to 11:00

Compressive learning is an approach to efficient large-scale learning based on sketching an entire dataset into a single mean embedding (the sketch), i.e. a vector of generalized moments. The learning task is then approximately solved as an inverse problem using an adapted parametric model. In this talk, we will first give a general overview of this learning framework, and then turn to more recent results on adaptive compressive learning. Previous works have focused on sketches obtained by averaging random features, which, while universal, can be poorly adapted to the problem at hand. We propose and study the idea of performing sketching based on a data-dependent Nyström approximation. From a theoretical perspective, we prove that the excess risk can be controlled under a geometric assumption relating the parametric model used to learn from the sketch and the covariance operator associated with the task at hand. Empirically, we show for k-means clustering and Gaussian modeling that, for a fixed sketch size, Nyström sketches indeed outperform those built with random features.
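To make the two sketching schemes contrasted in the talk concrete, the following is a minimal illustrative sketch (not the speaker's implementation): both compute a dataset sketch as the empirical mean of a feature map for the Gaussian kernel, once with random Fourier features and once with data-dependent Nyström features built from sampled landmarks. All names, the bandwidth, and the regularization constant are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_sketch(X, m, sigma=1.0):
    """Sketch = mean of m random Fourier features (Gaussian kernel).

    Returns a vector of length 2*m (cos and sin components)."""
    n, d = X.shape
    W = rng.normal(scale=1.0 / sigma, size=(d, m))  # random frequencies
    Z = X @ W
    feats = np.concatenate([np.cos(Z), np.sin(Z)], axis=1) / np.sqrt(m)
    return feats.mean(axis=0)

def gauss_kernel(A, B, sigma=1.0):
    """Pairwise Gaussian kernel matrix between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def nystrom_sketch(X, m, sigma=1.0):
    """Sketch = mean of data-dependent Nystrom features.

    Features are K(x, L) @ K(L, L)^{-1/2} for m sampled landmarks L,
    so the sketch has length m."""
    n, d = X.shape
    idx = rng.choice(n, size=m, replace=False)
    L = X[idx]                                  # landmark points
    Kmm = gauss_kernel(L, L, sigma)
    Knm = gauss_kernel(X, L, sigma)
    # K_mm^{-1/2} via eigendecomposition, regularized for stability
    w, V = np.linalg.eigh(Kmm + 1e-8 * np.eye(m))
    Kmm_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    feats = Knm @ Kmm_inv_sqrt
    return feats.mean(axis=0)

X = rng.normal(size=(500, 2))
s_rff = rff_sketch(X, m=64)      # shape (128,)
s_nys = nystrom_sketch(X, m=64)  # shape (64,)
```

In both cases the sketch size is independent of the number of samples n, which is what enables learning from the compressed representation alone; the Nyström variant differs in that its feature map is adapted to the data distribution through the sampled landmarks.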
