Distilling importance sampling for likelihood-free inference

Dennis Prangle (University of Bristol)

Friday 17th February 2023, 15:00-16:00, Zoom

Abstract

Likelihood-free inference involves inferring parameter values given observed data and a simulator model. The simulator is computer code taking the parameters, performing stochastic calculations, and outputting simulated data. In this work, we view the simulator as a function whose inputs are (1) the parameters and (2) a vector of pseudo-random draws, and attempt to infer all these inputs. This is challenging as the resulting posterior can be high dimensional and involve strong dependence.
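To make this view concrete, here is a minimal hypothetical sketch (not taken from the talk): a toy Gaussian location-scale simulator written as a deterministic function of its parameters and a vector of standard normal pseudo-random draws, so that fixing both inputs fixes the output.

```python
import numpy as np

def simulator(theta, u):
    """Toy simulator viewed as a deterministic function of its inputs.

    theta : model parameters (here a location and a log-scale).
    u     : vector of standard normal pseudo-random draws.

    All stochasticity enters through u, so inferring (theta, u) jointly
    recovers both the parameters and the latent randomness behind the data.
    """
    loc, log_scale = theta
    return loc + np.exp(log_scale) * u  # one simulated datum per draw

# Hypothetical usage: fixing u makes repeated calls reproducible.
rng = np.random.default_rng(0)
u = rng.standard_normal(10)
y_sim = simulator((1.0, 0.5), u)
```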

We approximate the posterior using normalising flows, a flexible parametric family of densities. Training data is generated by ABC importance sampling with a large bandwidth parameter. This data is then "distilled" by using it to train the normalising flow parameters. The process is iterated, using the updated flow as the importance sampling proposal and slowly reducing the ABC bandwidth, until the proposal is a good approximation to the posterior. Unlike most other likelihood-free methods, we avoid the need to reduce the data to low-dimensional summary statistics, and hence can achieve more accurate results.
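The loop below is a minimal, hypothetical sketch of this iteration, using NumPy and SciPy only and not the method's actual implementation: a Gaussian proposal stands in for the normalising flow, the toy simulator and the standard normal prior on the inputs (theta, u) are assumptions, and "training" the flow is replaced by a weighted moment-matching refit. It is meant only to show the structure: propose, weight with an ABC kernel, refit the proposal, shrink the bandwidth.

```python
import numpy as np
from scipy.stats import multivariate_normal

def simulate(x):
    theta, u = x[0], x[1:]
    return theta + u  # toy simulator: one datum per pseudo-random draw

y_obs = np.array([1.2, 0.8, 1.1])        # pretend observations (assumption)
dim = 1 + len(y_obs)                     # one parameter plus one draw per datum
mean, cov = np.zeros(dim), np.eye(dim)   # initial proposal = N(0, I) prior
eps = 5.0                                # large initial ABC bandwidth

for _ in range(20):
    prop = multivariate_normal(mean, cov)
    xs = prop.rvs(size=2000)
    # ABC importance weights: prior density x Gaussian ABC kernel / proposal density
    dists = np.array([np.sum((simulate(x) - y_obs) ** 2) for x in xs])
    log_w = (multivariate_normal(np.zeros(dim), np.eye(dim)).logpdf(xs)
             - 0.5 * dists / eps ** 2
             - prop.logpdf(xs))
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # "Distil" the weighted sample into the proposal (weighted moment fit);
    # the actual method would instead train a normalising flow on these weights.
    mean = w @ xs
    cov = (xs - mean).T @ (w[:, None] * (xs - mean)) + 1e-6 * np.eye(dim)
    eps *= 0.8                           # slowly reduce the ABC bandwidth
```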
