Institut de Mathématiques de Marseille, UMR 7373





10 January 2019: 4 events

Seminar

  • Séminaire Singularités

    Thursday 10 January 11:00-12:00 - Bernd SCHOBER - Leibniz Universität Hannover

    Polyhedral invariants for desingularization

    Abstract: The goal of my talk is to present a convex-geometric viewpoint on singularities and their resolution. More precisely, we discuss how the Newton polyhedron and the Hironaka polyhedron of a Weierstrass polynomial provide invariants of the singularity that reflect how "bad" the singularities are. Here, the Hironaka polyhedron is a certain projection of the Newton polyhedron. After a brief introduction to these notions, we study the behaviour of the Hironaka polyhedron under blow-ups for curves and surfaces. Then we explain how this leads to an invariant for desingularization of surfaces in any characteristic that decreases strictly after blowing up a sufficiently nice center. We focus on the ideas and try to hide the technical details as much as possible.
    This is joint work with Vincent Cossart (Versailles).
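
    For orientation, here is a sketch of the standard definitions (conventions for the Hironaka polyhedron vary between references, so this is illustrative rather than the speaker's exact setup). For f = \sum_\alpha c_\alpha x^\alpha, the Newton polyhedron is

        \mathrm{NP}(f) \;=\; \operatorname{conv}\Bigl(\bigcup_{c_\alpha \neq 0} \bigl(\alpha + \mathbb{R}_{\ge 0}^{n}\bigr)\Bigr) \subseteq \mathbb{R}^{n},

    and for a Weierstrass polynomial f = y^d + \sum_{i=1}^{d} f_i(x)\, y^{d-i}, one common form of the projected (Hironaka) polyhedron records the exponents of the coefficients f_i, rescaled by 1/i:

        \Delta(f; x) \;=\; \operatorname{conv}\Bigl(\bigcup_{i=1}^{d} \bigl\{\tfrac{\alpha}{i} : x^{\alpha} \text{ appears in } f_i\bigr\} + \mathbb{R}_{\ge 0}^{n}\Bigr).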


    Location: FRUMAM - Aix-Marseille Université - Site St Charles
    3, place Victor Hugo - case 39
    13331 MARSEILLE Cedex 03


  • Séminaire Logique et Interactions

    Thursday 10 January 11:00-12:30 - Dimitri ARA - I2M, Aix-Marseille Université

    Séminaire Logique et Interactions (TBA)

    Abstract: TBA


    Location: Seminar room 304-306 (3rd floor) - Institut de Mathématiques de Marseille (UMR 7373)
    Site Sud - Bâtiment TPR2
    Campus de Luminy, Case 907
    13288 MARSEILLE Cedex 9


  • Séminaire Signal et Apprentissage

    Thursday 10 January 14:00-15:00 - Rémi GRIBONVAL - INRIA, Rennes

    Approximation with sparsely connected deep networks

    Abstract: Many of the data analysis and processing pipelines that have been carefully engineered by generations of mathematicians and practitioners can in fact be implemented as deep networks. Allowing the parameters of these networks to be automatically trained (or even randomized) makes it possible to revisit certain classical constructions.
    The talk first describes an empirical approach to approximating a given matrix by a fast linear transform through numerical optimization. The main idea is to write fast linear transforms as products of a few sparse factors, and to iteratively optimize over the factors. This corresponds to training a sparsely connected, linear, deep neural network. Learning algorithms exploiting iterative hard-thresholding have been shown to perform well in practice, a striking example being their ability to somehow “reverse engineer” the fast Hadamard transform. Yet, developing a solid understanding of their conditions of success remains an open challenge.
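
    As an illustration of the factorization idea, here is a minimal sketch in Python: alternating gradient steps on each factor, each followed by a hard-thresholding projection onto a fixed sparsity budget. This is a toy version under stated assumptions, not the speaker's actual algorithm; the function names, the per-factor budget k, and the step size lr are all illustrative.

        import numpy as np

        def hard_threshold(M, k):
            """Zero all but the k largest-magnitude entries of M."""
            if k >= M.size:
                return M
            cutoff = np.partition(np.abs(M).ravel(), -k)[-k]
            return np.where(np.abs(M) >= cutoff, M, 0.0)

        def chain(mats, n):
            """Product of a possibly empty list of n x n matrices."""
            out = np.eye(n)
            for M in mats:
                out = out @ M
            return out

        def sparse_factorize(A, num_factors=3, k=64, steps=2000, lr=0.05, seed=0):
            """Toy IHT-style scheme: approximate A by a product of num_factors
            sparse matrices, cycling a gradient step over one factor at a time."""
            n = A.shape[0]
            rng = np.random.default_rng(seed)
            S = [hard_threshold(rng.standard_normal((n, n)) / n, k)
                 for _ in range(num_factors)]
            for _ in range(steps):
                for j in range(num_factors):
                    L, R = chain(S[:j], n), chain(S[j + 1:], n)
                    # Gradient of 0.5 * ||A - L @ S[j] @ R||_F^2 in S[j].
                    G = L.T @ (L @ S[j] @ R - A) @ R.T
                    S[j] = hard_threshold(S[j] - lr * G, k)
            return S

    For scale: the n x n Hadamard matrix admits an exact factorization into log2(n) butterfly factors with 2n nonzeros each, which is the structure the “reverse engineering” experiments recover.
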
    In the second part, we study the expressivity of sparsely connected deep networks. Measuring a network’s complexity by its number of connections, we consider the class of functions whose error of best approximation by networks of a given complexity decays at a certain rate. Using classical approximation theory, we show that this class can be endowed with a norm that makes it a nice function space, called an approximation space. We establish that the presence of certain “skip connections” has no impact on the approximation space, and discuss the role of the network’s nonlinearity (also known as the activation function) on the resulting spaces, as well as the benefits of depth.
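
    In classical approximation-theory notation this is the standard construction (sketched here; the talk's exact normalization may differ). Writing \Sigma_n for the set of networks with at most n connections and X for the ambient space,

        E_n(f) := \inf_{g \in \Sigma_n} \|f - g\|_X, \qquad \|f\|_{A^{\alpha}_{q}(X)} := \Bigl(\sum_{n \ge 1} \tfrac{1}{n}\,\bigl(n^{\alpha} E_n(f)\bigr)^{q}\Bigr)^{1/q},

    with the usual supremum modification for q = \infty; the approximation space A^{\alpha}_{q}(X) collects the f with finite norm, i.e. those whose best-approximation error decays roughly like n^{-\alpha}.
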
    For the popular ReLU nonlinearity (as well as its powers), we relate the newly identified spaces to classical Besov spaces, which have a long history as image models associated to sparse wavelet decompositions. The sharp embeddings that we establish highlight how depth enables sparsely connected networks to approximate functions of increased “roughness” (decreased Besov smoothness) compared to shallow networks and wavelets.
    Joint work with Luc Le Magoarou (Inria), Gitta Kutyniok (TU Berlin), Morten Nielsen (Aalborg University) and Felix Voigtlaender (KU Eichstätt).


    Location: FRUMAM - Aix-Marseille Université - Site St Charles
    3, place Victor Hugo - case 39
    13331 MARSEILLE Cedex 03


  • Séminaire Singularités

    Thursday 10 January 14:00-15:00 - Helge PEDERSEN - Universidade Federal do Ceará, Brazil

    The Tjurina transform of determinantal singularities

    Abstract: Determinantal singularities are a class of singularities that has attracted a lot of interest lately, especially the subclass of essentially isolated determinantal singularities (EIDS). It is possible to define two special transformations/modifications of determinantal singularities, the Tjurina transform and its transpose. The talk will be about defining these and establishing their basic properties. We will first look at the case of generic determinantal singularities, where the transforms are resolutions; we give their homotopy type and show their relation to the Nash transform. We will then look at the general case, where we will see that the Tjurina transform is very often a local complete intersection. Lastly, we will show how the Tjurina transform can be used to obtain resolutions in some special cases.
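
    For orientation, a sketch of the standard definition (in common notation, not necessarily the speaker's): a germ X \subseteq (\mathbb{C}^N, 0) is determinantal of type (m, n, t) if

        X = F^{-1}\bigl(M^{t}_{m,n}\bigr), \qquad M^{t}_{m,n} := \{A \in \mathrm{Mat}(m, n; \mathbb{C}) : \operatorname{rank} A < t\},

    for some holomorphic map F with X of the expected codimension (m - t + 1)(n - t + 1); X is an EIDS when, away from the origin, F is transverse to the rank stratification of M^{t}_{m,n}.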

    Location: FRUMAM - Aix-Marseille Université - Site St Charles
    3, place Victor Hugo - case 39
    13331 MARSEILLE Cedex 03


10 January 2019: 1 event

Scientific event