Monday, February 27, 14:00-15:00 -
Ilaria GIULINI - INRIA Saclay
Kernel spectral clustering
Abstract: We consider the setting of performing spectral clustering in a Hilbert space. We show how spectral clustering, coupled with a preliminary change of representation in a reproducing kernel Hilbert space, can bring the representation of the classes down to a low-dimensional space, and we propose a new spectral clustering algorithm that automatically estimates the number of classes.
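The pipeline described in the abstract can be illustrated with a minimal sketch: map the data into an RKHS via a kernel Gram matrix, extract a low-dimensional representation from the leading eigenvectors of the normalized kernel matrix, and cluster there. The eigengap rule used below to estimate the number of classes is a standard heuristic chosen for illustration, not necessarily the estimator proposed in the talk; the RBF kernel and its bandwidth are likewise assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gram matrix of the RBF kernel: the implicit change of
    # representation into a reproducing kernel Hilbert space.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def kmeans(U, k, iters=100):
    # Deterministic farthest-point initialization, then Lloyd iterations.
    centers = [U[0]]
    for _ in range(k - 1):
        d2 = np.min([((U - c)**2).sum(axis=1) for c in centers], axis=0)
        centers.append(U[int(np.argmax(d2))])
    centers = np.array(centers)
    for _ in range(iters):
        d2 = ((U[:, None, :] - centers[None, :, :])**2).sum(-1)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(axis=0)
    return labels

def spectral_clustering(X, sigma=1.0, max_k=10):
    K = rbf_kernel(X, sigma)
    d = K.sum(axis=1)
    # Symmetrically normalized kernel matrix; its top eigenvalue is 1,
    # with multiplicity equal to the number of well-separated groups.
    M = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(M)
    vals, vecs = vals[::-1], vecs[:, ::-1]     # sort descending
    # Eigengap heuristic: estimate k at the largest gap in the spectrum.
    gaps = vals[:max_k - 1] - vals[1:max_k]
    k = int(np.argmax(gaps)) + 1
    # Low-dimensional representation: top-k eigenvectors, row-normalized.
    U = vecs[:, :k]
    U = U / np.linalg.norm(U, axis=1, keepdims=True)
    return kmeans(U, k), k
```

On two well-separated Gaussian blobs this recovers both the number of classes (k = 2) and the correct partition, since the normalized kernel matrix is nearly block-diagonal and its spectrum shows a sharp gap after the second eigenvalue.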
Monday, February 27, 15:30-16:30 -
Michael PERROT - Université Jean Monnet (Saint-Étienne)
Learning Metrics with Controlled Behaviour
Abstract: The goal in Machine Learning is to acquire new knowledge from data. To achieve this, many algorithms make use of a notion of distance or similarity between examples. A representative example is the nearest neighbour classifier, which is based on the idea that two similar examples should share the same label: it thus critically depends on the notion of metric considered. Depending on the task at hand, these metrics should have different properties, but manually choosing an adapted comparison function can be tedious and difficult. The idea behind Metric Learning is to automatically tailor such metrics to the problem at hand. One of the main limitations of standard methods is that they offer little control over the behaviour of the learned metrics. In this talk I will present two approaches specifically designed to overcome this problem. In the first, we consider a general framework able to take into account a reference metric acting as a guide for the learned metric; we then study theoretically the benefit of using such side information. In the second approach, we propose to control the underlying transformation of the learned metric. Specifically, we use recent advances in the field of Optimal Transport to force it to follow a particular geometrical transformation.
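The first idea, a reference metric acting as a guide, can be sketched with a toy Mahalanobis metric learner: pairwise margin losses shape the metric, while a Frobenius-norm term regularizes it toward the reference matrix M0 (biased regularization). The specific hinge losses, margins, and projected gradient update below are assumptions made for illustration, not the algorithms from the talk.

```python
import numpy as np

def learn_metric(X, y, M0, lam=0.1, lr=0.01, iters=200):
    # Learn a Mahalanobis metric d_M(x, x')^2 = (x - x')^T M (x - x'),
    # minimizing pairwise margin losses + lam * ||M - M0||_F^2,
    # so the reference metric M0 acts as a guide for the learned M.
    n = len(X)
    M = M0.astype(float).copy()
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    for _ in range(iters):
        # Gradient of the regularizer pulls M back toward the reference.
        G = 2 * lam * (M - M0)
        for i, j in pairs:
            z = X[i] - X[j]
            dist2 = z @ M @ z
            if y[i] == y[j] and dist2 > 1:
                G += np.outer(z, z)   # pull similar pairs inside margin 1
            elif y[i] != y[j] and dist2 < 2:
                G -= np.outer(z, z)   # push dissimilar pairs past margin 2
        M -= lr * G / len(pairs)
        # Project onto the PSD cone so d_M stays a valid pseudo-metric.
        w, V = np.linalg.eigh(M)
        M = (V * np.clip(w, 0.0, None)) @ V.T
    return M
```

On toy data where only the first feature is class-relevant, the learned metric strongly down-weights the noisy second feature, while the regularizer keeps the informative direction close to the identity reference.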