Contributions to Approximate Bayesian Inference

Alexander Buchholz (MRC Biostatistics Unit)

Friday 6 December 2019, 15:00–16:00, Maths 311B

Abstract

The Bayesian paradigm allows us to carefully assess uncertainty in the predicted outcomes of machine learning and statistical models. However, computing posterior distributions remains a difficult problem and has led to approximate methods such as MCMC sampling and Variational Inference. In this talk I will present two recent contributions to the field of Bayesian computation. First, I will introduce an improved learning algorithm for Variational Inference, based on reducing the variance of stochastic gradients via quasi-Monte Carlo sampling. The improved algorithm comes with strong guarantees, can readily be implemented in existing frameworks, and speeds up computations substantially. The second part of the talk is dedicated to Bayesian model choice in large-data settings. Using a divide-and-conquer approach that splits the initial dataset into chunks, I achieve large improvements in computing time and can run Bayesian inference algorithms in parallel with restricted communication. The suggested approach is also applicable in settings that require privacy preservation.
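
The first contribution replaces the i.i.d. Monte Carlo draws behind reparameterised ELBO gradients with a quasi-Monte Carlo point set, which lowers the variance of the gradient estimator. The following is a minimal sketch of that idea on a toy standard-normal target with a Gaussian variational family, using SciPy's scrambled Sobol sequence; the model, variable names and settings are illustrative only and not taken from the paper.

```python
import numpy as np
from scipy.stats import norm, qmc

# Toy model: target p(z) = N(0, 1), variational family q(z) = N(mu, sigma^2).
# Reparameterisation: z = mu + sigma * eps, with eps ~ N(0, 1).
def elbo_gradient_mu(mu, sigma, eps):
    """Estimate of d(ELBO)/d(mu) averaged over the draws eps."""
    z = mu + sigma * eps
    # For p = N(0, 1): d/dmu log p(z) = -z; the entropy term does not depend on mu.
    return np.mean(-z)

rng = np.random.default_rng(0)
mu, sigma, n = 1.0, 0.5, 64

# Plain Monte Carlo draws.
eps_mc = rng.standard_normal(n)

# Quasi-Monte Carlo draws: scrambled Sobol points mapped through the normal inverse CDF.
sobol = qmc.Sobol(d=1, scramble=True, seed=0)
eps_qmc = norm.ppf(sobol.random(n)).ravel()

print("MC gradient estimate :", elbo_gradient_mu(mu, sigma, eps_mc))
print("QMC gradient estimate:", elbo_gradient_mu(mu, sigma, eps_qmc))
# Repeating this over many seeds shows that the QMC estimator's variance
# shrinks faster with n than the usual O(1/n) Monte Carlo rate.
```

The true gradient here is -mu, so both estimators are unbiased; the point of the QMC substitution is that the more evenly spread point set yields a lower-variance estimate at the same number of samples.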

https://arxiv.org/abs/1807.01604

https://arxiv.org/abs/1910.04672
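
The second contribution, as described in the abstract, splits the dataset into chunks and runs inference on each chunk in parallel with restricted communication. The sketch below illustrates that general divide-and-conquer idea on a toy conjugate Gaussian model, where sub-posteriors computed on separate chunks can be recombined exactly by multiplying Gaussian densities. This is a generic illustration of parallel, low-communication Bayesian computation, not the marginal-likelihood-based model-choice procedure from the paper; the model and all names are hypothetical.

```python
import numpy as np
from multiprocessing import Pool

# Toy conjugate model: y_i ~ N(theta, 1) with prior theta ~ N(0, 1).
# Split the data into S chunks, compute each chunk's sub-posterior in parallel,
# then combine the Gaussian sub-posteriors by multiplying their densities.

def chunk_posterior(args):
    """Sub-posterior N(m, v) for one chunk, using a prior flattened to N(0, S)."""
    y_chunk, n_chunks = args
    prior_prec = 1.0 / n_chunks           # share the prior evenly across chunks
    prec = prior_prec + len(y_chunk)      # sub-posterior precision
    mean = np.sum(y_chunk) / prec         # sub-posterior mean
    return mean, 1.0 / prec

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = rng.normal(loc=2.0, scale=1.0, size=10_000)
    S = 8
    chunks = np.array_split(y, S)

    # Each worker only sees its own chunk of the data.
    with Pool(S) as pool:
        subs = pool.map(chunk_posterior, [(c, S) for c in chunks])

    # Product of Gaussians: precisions add, precision-weighted means add.
    precs = np.array([1.0 / v for _, v in subs])
    means = np.array([m for m, _ in subs])
    post_prec = precs.sum()
    post_mean = (precs * means).sum() / post_prec
    print("Combined posterior mean/variance:", post_mean, 1.0 / post_prec)
```

In this conjugate example the product of the chunk sub-posteriors recovers the full-data posterior exactly; the appeal of the divide-and-conquer strategy is that each worker only ever touches its own chunk, which is also what makes such schemes attractive when privacy preservation is required.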
