Date(s) - 20/03/2017
11:00 – 12:00
Approximate Bayesian Computation (ABC) has grown into a standard methodology to handle Bayesian inference in models associated with intractable likelihood functions.
In the first part, we will show how our ABC Random Forests (RF) methodology can be used to select a model in a Bayesian context. We modify the way Bayesian model selection is both understood and operated: we rephrase the inferential goal as a classification problem, first predicting the model that best fits the data with RF, and postponing the approximation of the posterior probability of the selected model to a second stage, also relying on RF.
Compared with earlier implementations of ABC model choice, the ABC RF approach offers several potential improvements:
(i) it often has a larger discriminative power among the competing models,
(ii) it is more robust against the number and choice of statistics summarizing the data,
(iii) the computing effort is drastically reduced (with a gain in computational efficiency of at least a factor of 50), and
(iv) it includes an approximation of the posterior probability of the selected model.
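The two-stage scheme can be illustrated with a minimal sketch in Python. This is not the authors' abcrf package or their population-genetics setup: the two toy models, the summary statistics, and all variable names below are illustrative assumptions. Stage 1 trains an RF classifier on a simulated reference table to predict the model index; stage 2 regresses the out-of-bag misclassification indicator on the summaries, so that one minus its prediction at the observed summaries approximates the posterior probability of the selected model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)

# Illustrative summary statistics (not the ones used in the talk).
def summarize(x):
    return np.array([x.mean(), x.var(), np.median(x)])

# Reference table: simulate from two toy models (normal vs exponential).
n_sim, n_obs = 2000, 100
stats, labels = [], []
for m in (0, 1):
    for _ in range(n_sim):
        theta = rng.uniform(0, 5)
        x = rng.normal(theta, 1, n_obs) if m == 0 else rng.exponential(theta + 0.1, n_obs)
        stats.append(summarize(x))
        labels.append(m)
stats, labels = np.array(stats), np.array(labels)

# Stage 1: predict the best-fitting model with an RF classifier.
clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
clf.fit(stats, labels)
obs = summarize(rng.normal(2.0, 1, n_obs))  # pseudo-observed data, model 0
selected = int(clf.predict(obs.reshape(1, -1))[0])

# Stage 2 (simplified): regress the out-of-bag error indicator on the
# summaries; 1 minus its prediction at the observed summaries gives an
# approximation of the posterior probability of the selected model.
oob_pred = clf.oob_decision_function_.argmax(axis=1)
err = (oob_pred != labels).astype(float)
reg = RandomForestRegressor(n_estimators=500, random_state=0).fit(stats, err)
post_prob = 1.0 - reg.predict(obs.reshape(1, -1))[0]
```

Note that stage 2 reuses the same reference table, so no extra simulations are needed beyond those required for the classifier.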
In the second part, we will consider parameter estimation questions. We advocate the derivation of a random forest for each component of the parameter vector, a tool from which an approximation to the marginal posterior distribution can be derived. Correlations between parameter components are handled by separate random forests. We will show that this technology offers significant gains in robustness to the choice of summary statistics and in computing time, when compared with standard ABC solutions.
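A per-component sketch under stated assumptions: the toy model, summaries, and names below are illustrative, not the talk's application. One RF regressor is fit per parameter component against the simulated summaries; its prediction at the observed summaries approximates that component's marginal posterior expectation (the full method also uses the forest's weights to approximate the whole marginal posterior, which this sketch omits).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Toy model with a two-component parameter (mu, sigma); names illustrative.
n_sim, n_obs = 2000, 100
params = np.column_stack([rng.uniform(-3, 3, n_sim), rng.uniform(0.5, 3, n_sim)])

def summarize(x):
    return [x.mean(), x.std(), np.median(x), np.abs(x - np.median(x)).mean()]

stats = np.array([summarize(rng.normal(mu, sigma, n_obs)) for mu, sigma in params])

# Pseudo-observed data generated with mu = 1.0, sigma = 2.0.
obs_stats = np.array(summarize(rng.normal(1.0, 2.0, n_obs))).reshape(1, -1)

# One forest per parameter component: each regresses that component on
# the summary statistics of the reference table.
estimates = []
for j in range(params.shape[1]):
    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    rf.fit(stats, params[:, j])
    estimates.append(rf.predict(obs_stats)[0])
```

Because each forest handles a single component, adding or removing summary statistics only perturbs the splits each tree selects, which is one source of the robustness claimed above.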
In the last part, we will cover some population genetics applications.