Title: Recent developments on unbiased Monte Carlo methods
Abstract: Monte Carlo estimators, based on Markov chains or interacting particle systems, are typically biased when run with a finite number of iterations (or a finite number of particles). Although this bias is usually considered unavoidable, and negligible in the usual asymptotic sense, it is an important obstacle on the path towards scalable numerical integration on large-scale distributed computing systems. In a series of works building on the seminal paper of Glynn and Rhee (2014), a number of collaborators and I construct couplings of Markov chains that lead to unbiased estimators computable at a finite (random) cost. I will describe the construction for some MCMC algorithms (Metropolis-Hastings, Gibbs, Hamiltonian Monte Carlo). I will then describe how particle methods (annealed importance samplers, sequential Monte Carlo samplers) can be "de-biased" as well, in a generic way. Beyond the main motivation of parallel computation, I will also mention other tasks that can benefit from unbiased Monte Carlo estimators, such as modularized Bayesian inference, normalizing constant estimation, and Bayesian cross-validation.
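To make the coupling construction concrete, here is a minimal sketch (not the speaker's exact algorithm) of a Glynn-Rhee-type unbiased estimator in Python: a lag-1 pair of Metropolis-Hastings chains with maximally coupled random-walk proposals and a common accept/reject uniform, targeting a toy standard normal. All tuning choices (the proposal scale `SIGMA`, the over-dispersed initialization, the toy target) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

SIGMA = 1.0  # std of the random-walk proposal (illustrative choice)

def log_target(x):
    """Toy target: standard normal log-density (stand-in for a posterior)."""
    return -0.5 * x * x

def norm_logpdf(z, mu):
    return -0.5 * ((z - mu) / SIGMA) ** 2 - 0.5 * np.log(2 * np.pi * SIGMA**2)

def maximal_coupling(mu_x, mu_y):
    """Draw (Zx, Zy) with Zx ~ N(mu_x, SIGMA^2), Zy ~ N(mu_y, SIGMA^2),
    maximizing P(Zx == Zy) (standard rejection construction)."""
    zx = rng.normal(mu_x, SIGMA)
    if np.log(rng.uniform()) + norm_logpdf(zx, mu_x) <= norm_logpdf(zx, mu_y):
        return zx, zx                      # proposals coincide
    while True:
        zy = rng.normal(mu_y, SIGMA)
        if np.log(rng.uniform()) + norm_logpdf(zy, mu_y) > norm_logpdf(zy, mu_x):
            return zx, zy

def mh_step(x):
    """One Metropolis-Hastings step for a single chain."""
    z = rng.normal(x, SIGMA)
    return z if np.log(rng.uniform()) < log_target(z) - log_target(x) else x

def coupled_mh_step(x, y):
    """Coupled MH kernel: maximally coupled proposals, common accept uniform.
    Once the chains meet, they propose and accept identically thereafter."""
    zx, zy = maximal_coupling(x, y)
    log_u = np.log(rng.uniform())
    x_new = zx if log_u < log_target(zx) - log_target(x) else x
    y_new = zy if log_u < log_target(zy) - log_target(y) else y
    return x_new, y_new

def unbiased_estimate(h, max_iter=10_000):
    """Unbiased estimator of E_pi[h] from one pair of lag-1 coupled chains:
    H = h(X_0) + sum_{t=1}^{tau-1} [h(X_t) - h(Y_{t-1})],
    where tau is the first t >= 1 with X_t == Y_{t-1}."""
    x = rng.normal(2.0, 1.0)   # X_0, over-dispersed initialization
    y = rng.normal(2.0, 1.0)   # Y_0
    est = h(x)                 # h(X_0)
    x = mh_step(x)             # X_1: the X chain runs one step ahead
    t = 1
    while x != y and t < max_iter:
        est += h(x) - h(y)     # bias-correction term h(X_t) - h(Y_{t-1})
        x, y = coupled_mh_step(x, y)
        t += 1
    return est, t              # t is the meeting time tau

# Averaging independent replicates is embarrassingly parallel and unbiased,
# which is the point of the construction for distributed computing.
results = [unbiased_estimate(lambda v: v) for _ in range(3000)]
mean_h = float(np.mean([e for e, _ in results]))   # close to E[X] = 0
```

The estimator telescopes in expectation, so its mean equals the limit of E[h(X_n)] under standard conditions; the cost per replicate is random but finite because the maximally coupled chains meet in finite time almost surely.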