Date(s) - 27/05/2019
14:00 - 15:00
Exponential concentration inequalities are helpful to guarantee that the difference between a parameter and its estimator is no greater than a given threshold, with probability tending to 1 exponentially fast as the sample size increases. Such inequalities are particularly useful in streaming algorithms, where a sample is obtained in a single pass over the file and so-called epsilon-delta approximations are wanted. In a recent work, Bertail and Clémençon (2019) obtained a general exponential inequality for negatively associated sampling designs, a family including rejective sampling, Rao-Sampford sampling and pivotal sampling.
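To give a concrete feel for such inequalities, the sketch below illustrates the simplest textbook case: Hoeffding's inequality for the mean of i.i.d. variables bounded in [0, 1]. This is only a toy illustration of the general idea (an exponential-in-n bound on the deviation probability), not the survey-sampling setting of the talk; the function names and parameters are chosen here for the example.

```python
import math
import random

def hoeffding_bound(n, t):
    # Hoeffding's inequality for the mean of n i.i.d. variables in [0, 1]:
    # P(|mean - E[mean]| >= t) <= 2 * exp(-2 * n * t**2),
    # an exponential decay in the sample size n.
    return 2.0 * math.exp(-2.0 * n * t * t)

def empirical_tail(n, t, reps=20000, seed=0):
    # Monte Carlo estimate of P(|mean - 0.5| >= t) for Uniform(0, 1) draws.
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        m = sum(rng.random() for _ in range(n)) / n
        if abs(m - 0.5) >= t:
            hits += 1
    return hits / reps

n, t = 100, 0.1
print(hoeffding_bound(n, t))   # theoretical upper bound (about 0.27)
print(empirical_tail(n, t))    # observed frequency, below the bound
```

The observed deviation frequency is far below the bound; Hoeffding's inequality is distribution-free, so it pays for its generality with slack.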
In this work, we define what we call the generalized Sen-Yates-Grundy conditions. Making use of a martingale characterization, we prove that under these conditions the Horvitz-Thompson estimator satisfies a version of the Azuma-Hoeffding theorem. These conditions hold true for rejective sampling, Chao's sampling, Tillé's eliminatory procedure and the generalized Midzuno method, for example.
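For readers less familiar with survey sampling, here is a minimal sketch of the Horvitz-Thompson estimator itself. For simplicity it uses Poisson sampling (independent inclusions), which is only an illustrative design and not one of those covered by the conditions above; the variable names are chosen for the example.

```python
import random

def horvitz_thompson(y, pi, sample):
    # Horvitz-Thompson estimator of the population total:
    # sum over sampled units k of y_k / pi_k, where pi_k is the
    # first-order inclusion probability of unit k.
    return sum(y[k] / pi[k] for k in sample)

def poisson_sample(pi, rng):
    # Poisson sampling: include each unit independently with prob pi_k.
    # (Illustrative only; the talk concerns designs such as rejective
    # sampling satisfying the generalized Sen-Yates-Grundy conditions.)
    return [k for k, p in enumerate(pi) if rng.random() < p]

rng = random.Random(42)
y = [float(k + 1) for k in range(200)]   # study variable
pi = [0.3] * 200                         # inclusion probabilities
true_total = sum(y)
# Averaging the estimator over many independent draws illustrates its
# design-unbiasedness: the average approaches the true total.
reps = 5000
avg_est = sum(horvitz_thompson(y, pi, poisson_sample(pi, rng))
              for _ in range(reps)) / reps
print(true_total, avg_est)
```

Concentration results such as the one presented in the talk quantify how fast a single realization of this estimator is likely to be close to the true total.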
This is joint work with Mathieu Gerber (University of Bristol).