Université de Montréal
Date(s) : 29/01/2021
14 h 30 min - 15 h 30 min
In this talk, I will give an introduction to tensor networks and give a very brief overview of three recent contributions from my group aiming at going beyond classical tensor decomposition models using the tensor network formalism.
Tensors are higher-order generalizations of vectors and matrices. As with matrix factorization techniques, one of the goals of tensor decomposition is to express a tensor as a product of small factors, thus reducing the number of parameters and potentially regularizing machine learning models. While linear algebra is ubiquitous and taught in most undergraduate curricula, tensor and multilinear algebra can be daunting. In the first part of the talk, I will try to give an easy and accessible introduction to tensor methods using the tensor network formalism. Tensor networks are an intuitive diagrammatic notation that makes it easy to reason about complex operations on high-order tensors.
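To make the parameter-reduction idea concrete, here is a minimal sketch (not taken from the talk) of a third-order tensor represented in tensor-train format; the shape `d`, rank `r`, and core names `G1`–`G3` are illustrative assumptions:

```python
import numpy as np

# Hypothetical example: a (d, d, d) tensor stored as three small TT cores.
d, r = 10, 3  # assumed mode size and tensor-train rank
rng = np.random.default_rng(0)

# TT cores: G1 of shape (d, r), G2 of shape (r, d, r), G3 of shape (r, d)
G1 = rng.standard_normal((d, r))
G2 = rng.standard_normal((r, d, r))
G3 = rng.standard_normal((r, d))

# Contracting the cores reconstructs the full tensor:
# T[i, j, k] = sum_{a, b} G1[i, a] * G2[a, j, b] * G3[b, k]
T = np.einsum('ia,ajb,bk->ijk', G1, G2, G3)

full_params = d ** 3                     # 1000 entries in the dense tensor
tt_params = G1.size + G2.size + G3.size  # only 150 parameters in TT format
print(T.shape, full_params, tt_params)
```

The contraction above is exactly the kind of operation that tensor network diagrams make easy to read: each core is a node, and each summed index is an edge between nodes.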
In the second part of the talk, I will give a very brief overview of three recent works from my group, ranging from tensorizing random projections to studying the VC dimension of tensor network models.
Guillaume Rabusseau is an assistant professor at Université de Montréal and holds a Canada CIFAR AI chair at the Mila research institute. Prior to joining Mila, he was an IVADO postdoctoral research fellow in the Reasoning and Learning Lab at McGill University, where he worked with Prakash Panangaden, Joelle Pineau and Doina Precup. He obtained his PhD in computer science in 2016 at Aix-Marseille University under the supervision of François Denis and Hachem Kadri. His research interests lie at the intersection of theoretical computer science and machine learning, and his work explores interconnections between tensors and machine learning to develop efficient learning methods for structured data relying on linear and multilinear algebra.