Distilling importance sampling for likelihood-free inference

Dennis Prangle

University of Bristol


Zoom

11 January 2023 (Wednesday), 14:00

Abstract:

Likelihood-free inference involves inferring parameter values given observed data and a simulator model. The simulator is computer code taking the parameters, performing stochastic calculations, and outputting simulated data. In this work, we view the simulator as a function whose inputs are (1) the parameters and (2) a vector of pseudo-random draws, and attempt to infer all these inputs. This is challenging, as the resulting posterior can be high-dimensional and involve strong dependence.

We approximate the posterior using normalizing flows, a flexible parametric family of densities. Training data is generated by ABC importance sampling with a large bandwidth parameter, and is then "distilled" by using it to train the normalizing flow parameters. The process is iterated, using the updated flow as the importance sampling proposal and slowly reducing the ABC bandwidth, until the proposal is a good approximation to the posterior. Unlike most other likelihood-free methods, we avoid the need to reduce the data to low-dimensional summary statistics, and hence can achieve more accurate results.
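To illustrate the iterative scheme described above, here is a minimal sketch of distilled ABC importance sampling on a toy problem. Everything in it is an assumption for illustration, not the talk's actual method: the simulator (observation = parameter plus Gaussian noise), the prior, the bandwidth schedule, and in particular a Gaussian proposal refit by moment matching, which stands in for training a normalizing flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (hypothetical): data = theta + u, with u a standard normal draw.
def simulator(theta, u):
    return theta + u

y_obs = 1.5  # observed data point (illustrative)

def abc_kernel(theta, u, eps):
    # Gaussian ABC kernel with bandwidth eps, comparing simulated to observed data
    d = simulator(theta, u) - y_obs
    return np.exp(-0.5 * (d / eps) ** 2)

# Gaussian proposal over theta: stands in for the normalizing flow in this sketch.
mean, std = 0.0, 5.0
eps = 5.0       # initial (large) ABC bandwidth
n = 2000        # importance samples per iteration

for it in range(15):
    # Propose parameters and pseudo-random draws; u is sampled from its prior,
    # so its density cancels in the importance weight.
    theta = rng.normal(mean, std, size=n)
    u = rng.normal(size=n)

    # Self-normalized importance weights: prior(theta) * kernel / proposal(theta)
    prior = np.exp(-0.5 * (theta / 10.0) ** 2)            # N(0, 10^2) prior, illustrative
    prop = np.exp(-0.5 * ((theta - mean) / std) ** 2) / std
    w = prior * abc_kernel(theta, u, eps) / prop
    w /= w.sum()

    # "Distil" the weighted sample: refit the proposal to it.
    # (In the paper this step trains the flow; here it is moment matching.)
    mean = np.sum(w * theta)
    std = max(np.sqrt(np.sum(w * (theta - mean) ** 2)), 1e-3)

    # Slowly reduce the ABC bandwidth, with a floor to keep weights stable.
    eps = max(eps * 0.8, 0.2)
```

For this conjugate toy model the exact posterior is approximately N(1.49, 0.99²), so after the loop `mean` and `std` should land near those values; the floor on `eps` is a practical guard against weight degeneracy rather than part of the method itself.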

Short bio:

Dennis Prangle is a senior lecturer in statistics at the University of Bristol. His current research is on the interface between Bayesian statistics and machine learning. He is particularly interested in developing approximate inference methods such as approximate Bayesian computation approaches and variational inference. One application is to likelihood-free inference, where simulation of data is possible but the likelihood function is unavailable. Another research interest is experimental design and how to quickly derive effective high-dimensional designs. He has also been working on applications to population genetics, physics, ecology and epidemiology.

A joint CEAUL / CEMAT seminar.