Date: 04/07/2013
14:00 – 15:00
Spectral Learning of Hidden Structure, or "How NLP (Natural Language Processing) Is Just a Matter of Compressive Sensing"

By Raphaël Bailly, Universitat Politècnica de Catalunya.

The spectral algorithm, as originally introduced, deals with sequences of observable symbols governed by a hidden sequence of states. In this talk, we consider an additional hidden layer, e.g. a hidden tree structure over a sequence, or a link structure between an input and an output sequence. We propose a framework for applying the spectral method to this kind of problem even when the Hankel matrix is unknown. Each observed statistic (a marginalization over all possible hidden structures for a given sequence) can be turned into a linear constraint on the joint-probability statistics that form the entries of the Hankel matrix. This yields a rank-minimization problem under linear constraints, which can be solved efficiently thanks to the nuclear-norm convex relaxation of the rank. The framework applies to any kind of hidden structure, provided there exist efficient ways of marginalizing over the hidden layer, such as the forward-backward or inside-outside algorithms. We will study in detail the case of general transducers, i.e. transducers with possibly empty observations: input and output can be generated jointly or independently, thereby inducing a hidden structure over pairs of sequences.
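As a rough illustration of the nuclear-norm relaxation mentioned in the abstract, the sketch below applies singular value thresholding (the proximal operator of the nuclear norm, a standard building block of nuclear-norm rank-minimization solvers) to a tiny noisy Hankel-style matrix. The matrix entries and threshold are purely illustrative assumptions, not taken from the talk, and the full constrained optimization the speaker describes is not reproduced here.

```python
import numpy as np

def singular_value_threshold(M, tau):
    """Proximal operator of tau * nuclear norm: shrink each singular
    value of M by tau and reassemble. Repeated inside iterative
    nuclear-norm minimization solvers to drive the rank down."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# Hypothetical rank-1 "Hankel" matrix of joint string probabilities
# (toy numbers for illustration: each row is a scaled copy of the first,
# as would arise from a 1-state model).
H = np.array([[0.4, 0.2, 0.1],
              [0.2, 0.1, 0.05],
              [0.1, 0.05, 0.025]])

# Corrupt it with small noise, mimicking estimation error in the
# observed statistics, then shrink the singular values.
rng = np.random.default_rng(0)
H_noisy = H + 0.01 * rng.standard_normal(H.shape)
H_shrunk = singular_value_threshold(H_noisy, tau=0.05)

# The noise singular values fall below the threshold, so the
# thresholded matrix recovers the underlying rank.
print(np.linalg.matrix_rank(H_shrunk))  # → 1
```

The threshold `tau` trades off fidelity to the observed entries against the rank of the recovered matrix; in a full solver it would be combined with projection onto the linear constraints derived from the marginalized statistics.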