Differentiating Metropolis-Hastings to Optimize Intractable Densities
Other conference contribution, 2023

We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers, allowing us to differentiate through probabilistic inference even when the model contains discrete components. Our approach fuses recent advances in stochastic automatic differentiation with traditional Markov chain coupling schemes, yielding an unbiased and low-variance gradient estimator. This lets us apply gradient-based optimization to objectives expressed as expectations over intractable target densities. We demonstrate our approach by finding an ambiguous observation in a Gaussian mixture model and by maximizing the specific heat in an Ising model.
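For context, the sketch below is not the authors' estimator; it only illustrates the kind of objective the abstract describes: a plain single-spin-flip Metropolis-Hastings sampler for a small 2D Ising model, used to estimate the specific heat as a Monte Carlo expectation in the inverse temperature beta. The lattice size, sweep counts, function names, and the naive finite-difference baseline at the end are illustrative assumptions; the paper's contribution is replacing such a baseline with an unbiased, low-variance gradient obtained by differentiating the sampler itself.

```python
# Minimal sketch (assumptions noted above, not the paper's method):
# single-spin-flip Metropolis-Hastings on a 2D Ising model, estimating the
# specific heat per spin C(beta) = beta^2 * Var[E] / N as an expectation
# over the intractable Boltzmann distribution.
import numpy as np

def energy(spins):
    """Nearest-neighbour Ising energy with periodic boundaries, J = 1."""
    return (-np.sum(spins * np.roll(spins, 1, axis=0))
            - np.sum(spins * np.roll(spins, 1, axis=1)))

def metropolis_specific_heat(beta, L=8, n_sweeps=2000, burn_in=500, seed=0):
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    energies = []
    for sweep in range(n_sweeps):
        for _ in range(L * L):  # one sweep = L*L single-spin proposals
            i, j = rng.integers(L), rng.integers(L)
            # Energy change from flipping spin (i, j)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nb
            # Metropolis acceptance: accept with probability min(1, exp(-beta * dE))
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1
        if sweep >= burn_in:
            energies.append(energy(spins))
    E = np.asarray(energies, dtype=float)
    return beta**2 * E.var() / (L * L)

# Naive finite-difference derivative of the objective in beta: noisy and
# biased for any finite step size, shown only as a baseline for the kind of
# gradient the paper's estimator provides without bias.
beta, h = 0.4, 1e-2
grad_fd = (metropolis_specific_heat(beta + h)
           - metropolis_specific_heat(beta - h)) / (2 * h)
print(grad_fd)
```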

Keywords

Metropolis-Hastings

Markov chain coupling

automatic differentiation

Authors

Gaurav Arya

Massachusetts Institute of Technology (MIT)

Ruben Seyer

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

University of Gothenburg

Frank Schäfer

Massachusetts Institute of Technology (MIT)

Kartik Chandra

Massachusetts Institute of Technology (MIT)

Alexander Lew

Massachusetts Institute of Technology (MIT)

Mathieu Huot

University of Oxford

Vikash Mansinghka

Massachusetts Institute of Technology (MIT)

Jonathan Ragan-Kelley

Massachusetts Institute of Technology (MIT)

Christopher Vincent Rackauckas

Massachusetts Institute of Technology (MIT)

JuliaHub, Inc.

Pumas-AI Inc.

Moritz Schauer

University of Gothenburg

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Differentiable Almost Everything Workshop of the 40th International Conference on Machine Learning
Honolulu, Hawaii, USA,

Subject Categories

Computational Mathematics

Transport Systems and Logistics

Probability Theory and Statistics

Roots

Basic sciences

More information

Latest update

8/23/2024