BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Europe/Paris
BEGIN:VEVENT
UID:7912@i2m.univ-amu.fr
DTSTART;TZID=Europe/Paris:20160201T000000
DTEND;TZID=Europe/Paris:20160206T000000
DTSTAMP:20241221T200358Z
URL:https://www.i2m.univ-amu.fr/evenements/statistical-learning-thematic-m
 onth-2016/
SUMMARY:Conference (CIRM\, Luminy\, Marseille): Statistical learning (Thema
 tic Month 2016)
DESCRIPTION:Conference (CIRM\, Luminy\, Marseille)\n\nTHEMATIC MON
 TH 2016\nWeek 1: Statistical learning\nFebruary 1-5\, 2016\n\nThis wee
 k will be devoted to statistical learning from both the theoretica
 l and applied perspectives. Statistical learning theory was develo
 ped in the 1970s and brought about a great revival in statistics. O
 n the one hand\, advances in computing allowed massive data collec
 tion and the implementation of powerful algorithms\, which are oft
 en memory- and computation-intensive. On the other hand\, the clas
 sical asymptotic theory used to prove the efficiency of estimatio
 n methods in modeling and prediction was limited by dimensionalit
 y problems.\nThe approaches developed in statistical learning hel
 ped to face new challenges such as the curse of dimensionality\, s
 mall sample sizes\, and now massive datasets. Data mining\, featur
 e selection\, and more recently Big Data emerged as specialized ap
 proaches for modeling and analyzing massive datasets.\nIn additio
 n to the theoretical developments (non-asymptotic theory)\, many a
 lgorithms emerged in statistics and computer science to meet the n
 ew needs that arose in areas such as bioinformatics\, social scien
 ces\, medicine\, and telecommunications.\nThis week aims to bring t
 ogether specialists in statistical learning working on advanced te
 chniques and coming from different fields\, mainly statistics\, bu
 t also computer science\, social science\, and bioinformatics.\nTh
 e main topics of interest include:\n- Supervised learning\n- Unsup
 ervised learning: clustering and density estimation\n- Model selec
 tion\n- Big data\n\nScientific Committee:\n- Gérard Biau (Universi
 té Pierre-et-Marie-Curie)\n- Pascal Massart (Université Paris-Sud)
 \n- Vincent Rivoirard (Université Paris-Dauphine)\n\nOrganizing Co
 mmittee:\n- Badih Ghattas (Aix-Marseille Université)\n- Liva Ralai
 vola (Aix-Marseille Université)\n\nSpeakers:\n- Francis Bach (ENS P
 aris\, France): Large-scale machine learning and convex optimizati
 on\n- Philippe Besse (INSA Toulouse): Learning and massive data\n
 - Gilles Blanchard (Potsdam University): Is adaptive early stoppin
 g possible in statistical inverse problems?\n- Sébastien Bubeck (P
 rinceton University & Microsoft Seattle): Entropy\, geometry\, an
 d a CLT for Wishart matrices\n- Peter Bühlmann (ETH Zurich): The p
 ower of heterogeneous large-scale data for high-dimensional causa
 l inference\n- Stéphane Canu (INSA Rouen): Mixed integer programmi
 ng for sparse and non-convex machine learning\n- Stéphane Chrétie
 n (National Physical Laboratory\, Teddington\, UK): A Lagrangian v
 iewpoint on Robust PCA\n- Emilie Devijver (Orsay-Belgique): Block-
 diagonal covariance selection for high-dimensional Gaussian graphi
 cal models\n- Stéphane Gaïffas (Ecole polytechnique): Statistical l
 earning with Hawkes processes and new matrix concentration inequal
 ities\n- Pierre Geurts (Université de Liège): Random forests varia
 ble importances: towards a better understanding and large-scale fe
 ature selection\n- Claire Lacour (Université Paris-Sud): About th
 e Goldenshluger-Lepski methodology for bandwidth selection\n- Matt
 hieu Lerasle (Université Nice-Sophia-Antipolis): Sub-Gaussian mea
 n estimators\n- Clément Levrard (Université Paris Diderot): Simpli
 cial reconstruction of manifolds via tangent plane estimation\n- S
 ébastien Loustau (Université d'Angers): Quantization\, Learning an
 d Games with OPAC\n- Stéphane Mallat (Ecole polytechnique): Unders
 tanding (or not) Deep Neural Networks\n- André Mas (Université de M
 ontpellier): Eigenvalue-free risk bounds for PCA projectors\n- Pat
 ricia Reynaud-Bouret (Université Nice-Sophia-Antipolis): Estimatio
 n of local independence graphs via Hawkes processes and link with t
 he functional neuronal connectivity\n- Eric Sibony (Telecom Paris T
 ech): A novel multiresolution framework for the statistical analys
 is of ranking data\n- Gilles Stoltz (CNRS\, HEC Paris\, France): R
 obust sequential learning with applications to the forecasting of e
 lectricity consumption and of exchange rates\n- Alexandre Tsybako
 v (CREST-ENSAE): Oracle inequalities for network models and spars
 e graphon estimation
CATEGORIES:Conference,Thematic month
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:STANDARD
DTSTART:20151025T020000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
END:VTIMEZONE
END:VCALENDAR