Learning negative mixture models by tensor decomposition

Date: 21/03/2014, 14:00 – 15:00

In this talk, we consider the problem of estimating the parameters of negative mixture models, i.e. mixture models that may involve negative weights. We show that every rational probability distribution on strings, a representation which occurs naturally in spectral learning, can be computed by a negative mixture of at most two probabilistic automata (or HMMs). We present a method to estimate the parameters of negative mixture models having a specific tensor structure in their low-order observable moments. Building upon a recent paper on tensor decompositions for learning latent variable models, we extend this work to the broader setting of tensors having a symmetric decomposition with positive and negative weights. This extension leads to a generalisation of the tensor power method to complex-valued tensors, for which we establish theoretical convergence guarantees. Finally, we show how our approach applies to negative Gaussian mixture models.
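For readers unfamiliar with the baseline being generalised, below is a minimal sketch of the classical symmetric tensor power method (the real, positive-weight setting of Anandkumar et al. that the talk extends to negative weights and complex-valued tensors). All function names are illustrative, and the example uses a toy orthogonally decomposable tensor rather than moments of an actual mixture model.

```python
import numpy as np

def tensor_apply(T, v):
    # Contract a symmetric 3rd-order tensor with v in two modes: T(I, v, v)
    return np.einsum('ijk,j,k->i', T, v, v)

def tensor_power_method(T, n_iter=100, n_restarts=10, seed=None):
    # Recover one (eigenvector, eigenvalue) pair of a symmetric
    # orthogonally decomposable tensor by repeated power iteration,
    # keeping the restart with the largest eigenvalue.
    rng = np.random.default_rng(seed)
    best_v, best_lam = None, -np.inf
    for _ in range(n_restarts):
        v = rng.standard_normal(T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            v = tensor_apply(T, v)
            v /= np.linalg.norm(v)
        lam = v @ tensor_apply(T, v)
        if lam > best_lam:
            best_v, best_lam = v, lam
    return best_v, best_lam

# Toy tensor T = sum_i w_i * a_i ⊗ a_i ⊗ a_i with orthonormal a_i
# and positive weights w_i (the setting the classical method handles).
d = 3
ws = np.array([2.0, 1.0, 0.5])
A = np.eye(d)  # orthonormal components: the standard basis vectors
T = sum(w * np.einsum('i,j,k->ijk', a, a, a) for w, a in zip(ws, A))

v, lam = tensor_power_method(T, seed=0)
# Deflation: subtract the recovered rank-1 term and repeat to find
# the remaining components.
T_deflated = T - lam * np.einsum('i,j,k->ijk', v, v, v)
```

With positive weights, each iteration converges (quadratically) to one of the components; the talk's contribution is precisely that this picture breaks when some weights are negative, motivating the complex-valued generalisation of the iteration.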
