Date: 04/11/2019
Time: 14:00 – 16:00
Badr-eddine CHÉRIEF-ABDELLATIF (CREST-ENSAE, Paris)
Bayesian inference provides an attractive framework for analyzing and sequentially updating knowledge from streaming data, but it is rarely computationally feasible in practice. In recent years, variational inference (VI) has become increasingly popular for approximating intractable posterior distributions in Bayesian statistics and machine learning. Nevertheless, despite promising results in real-life applications, relatively little attention has been paid in the literature to the theoretical properties of VI. In this talk, we will present some recent advances in the theory of VI. We will show that variational inference is consistent under mild conditions and retains the same properties as exact Bayesian inference, and we will present several applications to illustrate these general results. We will also briefly discuss some extensions of these results.
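To make the setting concrete, here is a minimal sketch (not from the talk) of what variational inference does: we pick a tractable family q(θ) = N(m, s²) and maximize the evidence lower bound (ELBO) by gradient ascent. The model below, a Gaussian mean with known noise variance and a Gaussian prior, is an assumed toy example chosen because its exact posterior is available in closed form, so we can check that VI recovers it; all variable names are illustrative.

```python
import math
import random

# Synthetic data: x_i ~ N(theta_true, sigma^2), with prior theta ~ N(0, tau^2).
random.seed(0)
sigma, tau, theta_true, n = 1.0, 1.0, 1.5, 50
data = [random.gauss(theta_true, sigma) for _ in range(n)]

# Exact conjugate posterior (available here only because the model is Gaussian;
# in realistic models this is intractable, which is what motivates VI).
post_prec = n / sigma**2 + 1 / tau**2
post_mean = (sum(data) / sigma**2) / post_prec
post_var = 1 / post_prec

# Variational family q(theta) = N(m, s^2).  Up to additive constants,
# ELBO(m, s) = -(1/(2 sigma^2)) * (sum_i (x_i - m)^2 + n s^2)
#              - (m^2 + s^2) / (2 tau^2) + log s
m, s, lr = 0.0, 1.0, 0.005
for _ in range(5000):
    grad_m = sum(x - m for x in data) / sigma**2 - m / tau**2
    grad_s = -n * s / sigma**2 - s / tau**2 + 1 / s
    m += lr * grad_m
    s += lr * grad_s

# For this conjugate model the ELBO maximizer coincides with the exact
# posterior, so (m, s^2) should converge to (post_mean, post_var).
print(m, s**2, post_mean, post_var)
```

Here the agreement between the variational optimum and the exact posterior is a feature of the conjugate toy model; the consistency results discussed in the talk concern what can still be guaranteed when no such closed form exists.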