Accelerating delayed-acceptance Markov chain Monte Carlo algorithms
Preprint, 2019

Delayed-acceptance Markov chain Monte Carlo (DA-MCMC) samples from a probability distribution via a two-stage version of the Metropolis-Hastings algorithm, combining the target distribution with a "surrogate", i.e. an approximate and computationally cheaper version of that distribution. DA-MCMC accelerates MCMC sampling in complex applications, while still targeting the exact distribution. We design a computationally faster, albeit approximate, DA-MCMC algorithm. We consider parameter inference in a Bayesian setting where a surrogate likelihood function is introduced in the delayed-acceptance scheme. When the evaluation of the likelihood function is computationally intensive, our scheme produces a 2-4 times speed-up compared to standard DA-MCMC, although the acceleration is highly problem dependent. Inference results for the standard delayed-acceptance algorithm and our approximated version are similar, indicating that our algorithm can return reliable Bayesian inference. As a computationally intensive case study, we introduce a novel stochastic differential equation model for protein folding data.
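The two-stage scheme described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of generic delayed-acceptance Metropolis-Hastings, not the authors' algorithm: the "expensive" target here is just a standard normal log-density, the surrogate is a deliberately mis-scaled normal, and the proposal is a symmetric random walk (so the proposal ratio cancels). A proposal must first pass a cheap stage-1 screen against the surrogate; only then is the expensive target evaluated, with the surrogate ratio divided out so the chain still targets the exact distribution.

```python
import numpy as np

# Hypothetical toy problem: a standard normal whose log-density we pretend
# is expensive to evaluate, plus a cheap, deliberately approximate surrogate.
def log_target(x):
    return -0.5 * x**2           # "expensive" stage-2 density (up to a constant)

def log_surrogate(x):
    return -0.5 * (x / 1.2)**2   # cheap stage-1 approximation

def da_mh(n_iters, step=1.0, seed=0):
    """Delayed-acceptance Metropolis-Hastings with a symmetric random walk."""
    rng = np.random.default_rng(seed)
    x = 0.0
    chain = np.empty(n_iters)
    for i in range(n_iters):
        y = x + step * rng.normal()
        # Stage 1: screen the proposal using only the cheap surrogate.
        if np.log(rng.uniform()) < log_surrogate(y) - log_surrogate(x):
            # Stage 2: correct with the expensive target; the surrogate
            # ratio enters inverted, so the overall kernel remains exact.
            log_a2 = (log_target(y) - log_target(x)) \
                     - (log_surrogate(y) - log_surrogate(x))
            if np.log(rng.uniform()) < log_a2:
                x = y
        chain[i] = x
    return chain

samples = da_mh(20000)
```

The speed-up comes from the fact that `log_target` is only evaluated for proposals that survive stage 1; in the paper's setting the stage-2 density is a computationally intensive likelihood and the surrogate is far cheaper.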

Keywords: stochastic differential equation; protein folding; Gaussian process; Bayesian inference; pseudo-marginal MCMC


Authors

Umberto Picchini, University of Gothenburg; Chalmers, Mathematical Sciences, Applied Mathematics and Statistics
Samuel Wiqvist, Lund University
Julie Lyng Forman, University of Copenhagen
Kresten Lindorff-Larsen, University of Copenhagen
Wouter Boomsma, University of Copenhagen


Subject Categories

Basic sciences; Probability Theory and Statistics
