Workshop day
"Tensors and covariance matrix estimation"
27 November 2015


Program

13.00 Xavier Luciani Tensor approaches for source separation: from applications to algorithms (and vice versa)
[Slides]
Summary
Tensor decompositions are now widely used in signal processing, notably for source separation and mixture identification purposes. In this context, we can generally distinguish two main kinds of problems for which tensor approaches can help. In the first case, the observed signals (the known data) have a natural tensor structure (sometimes after some data reshaping) and can thus be directly modeled by an appropriate tensor decomposition. Conversely, the second class of problems involves some mathematical transformation of the data (for instance, the computation of higher-order statistics) in order to rewrite the problem in tensor form. This distinction will be developed in the first part of the talk and illustrated by real-life applications from the chemistry and telecommunications fields. It will then appear that the Canonical Polyadic Decomposition (CPD) of tensors, also known as the PARAllel FACtor decomposition (PARAFAC), plays a central role in most tensor approaches. As a consequence, a great effort has been made over the last decade to develop efficient CPD algorithms, and it remains a hot topic. The second part of the talk will therefore be dedicated to this algorithmic aspect, with an overview of the main CPD algorithms.
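As a rough illustration of the algorithmic part of the talk, below is a minimal CPD/PARAFAC sketch based on alternating least squares for a third-order tensor. It is not taken from the talk: the function names, the NumPy-only implementation and the fixed iteration count are illustrative assumptions.

    import numpy as np

    def unfold(T, mode):
        """Matricize a 3-way tensor along the given mode (that mode indexes the rows)."""
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def khatri_rao(U, V):
        """Column-wise Kronecker product of U (m x r) and V (n x r)."""
        return np.einsum('ir,jr->ijr', U, V).reshape(U.shape[0] * V.shape[0], -1)

    def cpd_als(T, rank, n_iter=100, seed=0):
        """Fit T ~ sum_r a_r o b_r o c_r and return the factor matrices A, B, C."""
        rng = np.random.default_rng(seed)
        A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
        for _ in range(n_iter):
            # Update each factor in turn by least squares, keeping the other two fixed.
            A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
            B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
            C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
        return A, B, C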
14.00 Daniel Kressner Low-rank tensor completion by Riemannian optimization
[Slides]
Summary
In tensor completion, the goal is to fill in missing entries of a partially known tensor under a low-rank constraint. We survey existing work in this area and discuss algorithms that perform Riemannian optimization on the manifold of tensors of fixed multilinear rank or fixed tensor train rank. Particular attention is paid to efficient implementation, and the resulting algorithm scales linearly in the size of the tensor. Examples with synthetic data demonstrate good recovery even when the vast majority of the entries are unknown. We illustrate the use of the developed algorithm in a range of applications, including the recovery of multidimensional images and the approximation of multivariate functions.
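As a hedged illustration of the Riemannian viewpoint, the sketch below runs Riemannian gradient descent for low-rank completion in the simpler matrix case (fixed-rank manifold, tangent-space projection, retraction by truncated SVD); the tensor algorithms of the talk proceed analogously on manifolds of fixed multilinear or tensor-train rank. The function name, fixed step size and iteration count are assumptions for the example, not the speaker's implementation.

    import numpy as np

    def lowrank_completion(A, mask, rank, n_iter=200, step=1.0):
        """Recover a rank-`rank` matrix from the entries of A where mask is True."""
        # Initialise on the manifold with a truncated SVD of the zero-filled data.
        U, s, Vt = np.linalg.svd(np.where(mask, A, 0.0), full_matrices=False)
        U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
        X = U @ np.diag(s) @ Vt
        for _ in range(n_iter):
            # Euclidean gradient of 0.5 * ||P_Omega(X - A)||_F^2.
            G = np.where(mask, X - A, 0.0)
            # Project it onto the tangent space of the fixed-rank manifold at X.
            GV = G @ Vt.T
            grad = U @ (U.T @ G) + GV @ Vt - U @ (U.T @ GV) @ Vt
            # Retract back onto the manifold by a rank-`rank` truncated SVD
            # (a line search on `step` would normally replace the fixed value).
            U, s, Vt = np.linalg.svd(X - step * grad, full_matrices=False)
            U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
            X = U @ np.diag(s) @ Vt
        return X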
15.00 Break
15.30 Guillaume Rabusseau Nonparametric Reduced-Rank Regression with Tensor-Structured Response
[Slides]
Summary
Extending univariate and multivariate methods to tensor-structured data is a challenging task which has recently received growing interest. We propose a nonparametric reduced-rank regression method adapted to tensor-structured output data. We first generalise reduced-rank regression from a vector-valued to a tensor-structured response variable. We then develop a novel nonparametric tensor approach relying on the notion of tensor-valued reproducing kernels, leading to a rank-penalized tensor regression problem for which we propose two learning algorithms.
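For reference, a minimal sketch of classical reduced-rank regression with a vector-valued response is given below; the talk generalises this setting to tensor-structured responses and to tensor-valued reproducing kernels. The function name and the pseudo-inverse-based least-squares step are illustrative choices, not the speaker's method.

    import numpy as np

    def reduced_rank_regression(X, Y, rank):
        """Minimise ||Y - X W||_F^2 over matrices W of rank at most `rank`."""
        # Ordinary least-squares solution (X: n x p, Y: n x q).
        W_ols = np.linalg.pinv(X) @ Y
        # Project the fitted values onto their top `rank` right singular directions.
        _, _, Vt = np.linalg.svd(X @ W_ols, full_matrices=False)
        V = Vt[:rank].T
        return W_ols @ V @ V.T  # rank-constrained coefficient matrix (p x q)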
16.00 Ahmad Karfoul Structured Tensor Decomposition for Brain Source Imaging
[Slides]
Summary
This talk addresses the localization and reconstruction of spatially distributed sources from ElectroEncephaloGraphic (EEG) signals. Several amplitude-modulated spikes originating from the same epileptic region are used to build a space-time-spike tensor from the EEG data. A Canonical Polyadic (CP) decomposition of this tensor is computed such that the column vectors of its spatial loading matrix are linear combinations of a physics-driven dictionary. A performance study of the proposed tensor-based approach on realistic simulated EEG data is provided.
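As a purely illustrative sketch of how such a space-time-spike tensor can be assembled (the epoching details, array layout and names below are assumptions, not the authors' code): EEG segments of equal length are extracted around each detected spike and stacked along a third, "spike" mode.

    import numpy as np

    def build_spike_tensor(eeg, spike_samples, half_width):
        """eeg: (n_channels, n_samples) array; spike_samples: spike peak indices."""
        epochs = [eeg[:, s - half_width:s + half_width]
                  for s in spike_samples
                  if s - half_width >= 0 and s + half_width <= eeg.shape[1]]
        # Stack into a channels x time x spike tensor, ready for a (constrained) CPD.
        return np.stack(epochs, axis=2)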
17.00 Discussions and conclusion