BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Europe/Paris
BEGIN:VEVENT
UID:6219@i2m.univ-amu.fr
DTSTART;TZID=Europe/Paris:20211213T000000
DTEND;TZID=Europe/Paris:20211217T000000
DTSTAMP:20241211T155307Z
URL:https://www.i2m.univ-amu.fr/evenements/meeting-in-mathematical-statist
 ics-2021/
SUMMARY:Conference (CIRM\, Luminy\, Marseille): Meeting in Mathematical Sta
 tistics 2021
DESCRIPTION:MULTIYEAR PROGRAM - CONFERENCE\nMeeting in Mathematical Stat
 istics / Rencontres de Statistique Mathématique\nMachine learning and non
 parametric statistics\n13 - 17 December 2021\n\nScientific Committee & Or
 ganizing Committee\nCristina Butucea (Université Paris-Est Marne-la-Vallé
 e)\nStanislav Minsker (University of Southern California)\nChristophe Pou
 et (École Centrale de Marseille)\nVladimir Spokoiny (Humboldt University o
 f Berlin)\n\nDescription\nContemporary machine learning algorithms define t
 he state of the art in diverse areas (computer vision\, robotics and spee
 ch recognition\, to name a few)\, but in many cases the theoretical justi
 fication behind the success of these methods is still missing. Mathemati
 cal results\, in particular statistical and probabilistic properties\, ar
 e being actively developed\, but many challenges remain. Deep learning an
 d generative models are prominent examples of areas with significant gap
 s between engineering success and theoretical understanding. To fill thi
 s gap\, tools from diverse areas such as nonparametric statistics\, appro
 ximation theory\, empirical process theory and computational efficiency a
 re needed. This conference aims at establishing new fruitful collaborati
 ons among experts in nonparametric statistics and theoretical computer s
 cience. The expected outcome of such collaborations is new developments i
 n the theory of machine learning\, including topics such as deep learnin
 g\, robustness\, privacy and estimation under fairness constraints.\n\nLec
 tures\nPeter Bartlett (UC Berkeley): Benign overfitting and adversarial e
 xamples\nGabor Lugosi (Pompeu Fabra University\, Barcelona): Network arch
 eology: a few results and questions\n\nTalks\nArya Akhavan (IIT - ENSA
 E): Distributed Zero-Order Optimization under Adversarial Noise\nRando
 lf Altmeyer (University of Cambridge): Statistical and computational g
 uarantees for sampling from high dimensional posterior distributions\n
 Denis Belomestny (University of Duisburg): Rates of convergence for de
 nsity estimation with generative adversarial networks\nAnnika Betken (
 University of Twente): Combining rank statistics and subsampling for a s
 olution to the change-point problem in time series analysis\nGilles Bl
 anchard (Université Paris-Saclay): Fast rates for prediction with limi
 ted expert advice\nTimothy Cannings (University of Edinburgh): Adaptiv
 e Transfer Learning\nArnak Dalalyan (CREST-ENSAE): Statistical guarant
 ees for generative models\nFarida Enikeeva (Université de Poitiers): C
 hange-Point Detection in Dynamic Networks with Missing Links\nSubhodh K
 otekal (University of Chicago): Minimax rates for sparse signal detect
 ion under correlation\nMatthias Löffler (ETH Zürich): AdaBoost and rob
 ust one-bit compressed sensing\nBéatrice Laurent-Bonneau (INSA de Toul
 ouse): Aggregated tests of independence based on HSIC measures\nTengyu
 an Liang (University of Chicago): Universal Prediction Band\, Semi-Def
 inite Programming and Variance Interpolation\nArshak Minasyan (CREST-E
 NSAE): All-In-One Robust Estimator of the Gaussian Mean\nMohamed Ndaou
 d (ESSEC): Minimax Supervised Clustering in the Anisotropic Gaussian M
 ixture Model: A new take on Robust Interpolation\nVianney Perchet (ENS
 AE & Criteo AI Lab): Active learning and/or online sign identification
 \nKolyan Ray (Imperial College London): Bayesian inference for multi-d
 imensional diffusions\nMarkus Reiß (Humboldt University Berlin): Infer
 ence on the maximal rank of time-varying covariance matrices using hig
 h-frequency data\nLionel Riou-Durand (University of Warwick): Metropol
 is Adjusted Underdamped Langevin Trajectories\nEtienne Roquain (Sorbon
 ne Université): Some transition boundaries for multiple testing with u
 nknown null distribution\nRichard Samworth (University of Cambridge): O
 ptimal subgroup selection\nGeorge Stepaniants (Massachusetts Institut
 e of Technology): Learning Partial Differential Equations in Reproduc
 ing Kernel Hilbert Spaces\nBotond Tibor Szabo (Bocconi University): Op
 timal distributed testing under communication constraints in high-dim
 ensional and nonparametric Gaussian white noise model\nMathias Trabs (
 Karlsruhe Institute of Technology): Dispersal density estimation acro
 ss scales\nNikita Zhivotovskiy (ETH): Stability and Generalization: So
 me recent results\n
CATEGORIES:Colloque
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:STANDARD
DTSTART:20211031T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
END:VTIMEZONE
END:VCALENDAR