Covariance matrices and covariance operators in machine learning and pattern recognition: A geometrical framework

Date: 24/11/2017, 10:00 – 11:00

Symmetric positive definite (SPD) matrices, in particular covariance matrices, play important roles in many areas of mathematics and statistics, with numerous applications in fields including machine learning, brain imaging, and computer vision. The set of SPD matrices is not a subspace of Euclidean space, and consequently algorithms that rely only on the Euclidean metric tend to be suboptimal in practice. Much recent research has therefore focused on exploiting the intrinsic geometrical structure of the set of SPD matrices, in particular its structure as a Riemannian manifold. In this talk, we will survey recent developments in the generalization of finite-dimensional covariance matrices to infinite-dimensional covariance operators via kernel methods, along with the corresponding geometrical structures. This direction exploits the power of kernel methods from machine learning within the framework of Riemannian geometry, both mathematically and algorithmically. The theoretical formulation will be illustrated with applications in computer vision, demonstrating both the power of kernel covariance operators and of algorithms based on their intrinsic geometry.
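As a concrete illustration of the Riemannian viewpoint sketched in the abstract, the snippet below computes two standard geodesic-type distances on the manifold of SPD matrices: the affine-invariant distance and the log-Euclidean distance. This is a minimal sketch assuming NumPy/SciPy; the function names are illustrative and not taken from the talk.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power, logm


def affine_invariant_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
    """
    A_inv_sqrt = fractional_matrix_power(A, -0.5)
    M = A_inv_sqrt @ B @ A_inv_sqrt
    return np.linalg.norm(logm(M), ord="fro")


def log_euclidean_distance(A, B):
    """Log-Euclidean distance: d(A, B) = || log(A) - log(B) ||_F."""
    return np.linalg.norm(logm(A) - logm(B), ord="fro")


# Two covariance matrices estimated from random data (illustrative only);
# a small ridge keeps them safely positive definite.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
Y = rng.standard_normal((100, 3))
A = np.cov(X, rowvar=False) + 1e-6 * np.eye(3)
B = np.cov(Y, rowvar=False) + 1e-6 * np.eye(3)

print(affine_invariant_distance(A, B))
print(log_euclidean_distance(A, B))
```

Unlike the plain Euclidean (Frobenius) distance, both of these metrics respect the intrinsic geometry of the SPD cone: the affine-invariant distance is unchanged under congruence transformations A ↦ CᵀAC for invertible C, while the log-Euclidean distance is cheaper to compute and often a good approximation in practice.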

