Deep neural networks and variational inequalities


Date/time
Date(s) - 21/06/2019
14:00 - 15:00



Motivated by structures that appear in deep neural networks, we investigate nonlinear composite models alternating proximity and affine operators defined on different spaces. We first show that a wide range of activation operators used in neural networks are actually proximity operators. We then establish conditions for the averagedness of the proposed composite constructs and investigate their asymptotic properties. It is shown that the limit of the resulting process solves a variational inequality which, in general, does not derive from a minimization problem. The analysis relies on tools from monotone operator theory and sheds some light on the asymptotic properties of a class of neural network structures.
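As a concrete illustration (a minimal numerical sketch in Python/NumPy, not the construction analyzed in the talk), the snippet below first checks that the ReLU activation coincides with the proximity operator of the indicator function of the nonnegative orthant, and then iterates a single prox-affine layer x ↦ prox(Wx + b). The matrix W, the bias b and the rescaling enforcing ||W|| < 1 are illustrative choices; this contraction condition is only a simple sufficient condition for convergence of the iterates, stronger than the averagedness conditions referred to in the abstract.

import numpy as np

rng = np.random.default_rng(0)

# 1) ReLU as a proximity operator:
# ReLU(x) = max(x, 0) is the proximity operator of the indicator function
# of the nonnegative orthant, i.e. the Euclidean projection onto [0, inf)^n.
def relu(x):
    return np.maximum(x, 0.0)

def prox_nonneg_indicator(x):
    # prox of the indicator of [0, inf)^n = projection onto the orthant
    return np.clip(x, 0.0, None)

x = rng.standard_normal(5)
assert np.allclose(relu(x), prox_nonneg_indicator(x))

# 2) Iterating one layer T(x) = prox(W x + b):
# W is rescaled so that ||W||_2 < 1, which makes T a contraction; the
# iterates x_{k+1} = T(x_k) then converge to the unique fixed point.
n = 5
W = rng.standard_normal((n, n))
W = 0.9 * W / np.linalg.norm(W, 2)   # spectral norm strictly below 1
b = rng.standard_normal(n)

def layer(x):
    return relu(W @ x + b)

x = np.zeros(n)
for k in range(200):
    x = layer(x)

# Fixed-point residual should be numerically zero at convergence.
print("residual:", np.linalg.norm(x - layer(x)))

At convergence the iterate satisfies x* = prox(W x* + b); by the standard variational characterization of proximity operators, this fixed-point relation can be read as a variational inequality in x* which, in general, is not the optimality condition of a minimization problem, in line with the abstract.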

http://pesquet.info

