Bayesian Posterior Approximation With Stochastic Ensembles
Paper in proceedings, 2023

We introduce ensembles of stochastic neural networks to approximate the Bayesian posterior, combining stochastic methods such as dropout with deep ensembles. The stochastic ensembles are formulated as families of distributions and trained to approximate the Bayesian posterior with variational inference. We implement stochastic ensembles based on Monte Carlo dropout, DropConnect and a novel non-parametric version of dropout and evaluate them on a toy problem and CIFAR image classification. For both tasks, we test the quality of the posteriors directly against Hamiltonian Monte Carlo simulations. Our results show that stochastic ensembles provide more accurate posterior estimates than other popular baselines for Bayesian inference.
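To illustrate the idea of a stochastic ensemble, the sketch below combines a deep ensemble with Monte Carlo dropout: several independently initialized (and, in practice, independently trained) dropout networks are queried with dropout kept active, and predictions are averaged over members and dropout masks to form an approximate posterior predictive. This is a minimal, hedged example, not the authors' implementation; the class and function names (DropoutMLP, predictive_distribution) and all hyperparameters are illustrative assumptions.

# Minimal sketch (not the paper's code): an ensemble of MC-dropout networks
# used to draw approximate posterior-predictive samples.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DropoutMLP(nn.Module):
    """Small classifier with dropout that stays active at prediction time."""

    def __init__(self, in_dim: int, hidden: int, n_classes: int, p: float = 0.1):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, n_classes)
        self.drop = nn.Dropout(p)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.drop(F.relu(self.fc1(x)))
        return self.fc2(h)


def predictive_distribution(ensemble, x, n_samples: int = 10) -> torch.Tensor:
    """Average softmax outputs over ensemble members and dropout samples.

    Each member is kept in train() mode so dropout masks are resampled on
    every forward pass (Monte Carlo dropout); averaging over members and
    masks approximates the posterior predictive distribution.
    """
    probs = []
    with torch.no_grad():
        for net in ensemble:
            net.train()  # keep dropout stochastic at prediction time
            for _ in range(n_samples):
                probs.append(F.softmax(net(x), dim=-1))
    return torch.stack(probs).mean(dim=0)


# Usage: five members form the stochastic ensemble; each would normally be
# trained on the data before prediction.
ensemble = [DropoutMLP(in_dim=2, hidden=64, n_classes=2) for _ in range(5)]
x = torch.randn(8, 2)
posterior_predictive = predictive_distribution(ensemble, x)  # shape (8, 2)

The same averaging scheme applies to the other stochastic members discussed in the abstract (e.g. DropConnect) by swapping the source of per-pass randomness; the paper's variational-inference training objective is not reproduced here.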

Deep learning architectures and techniques

Authors

Oleksandr Balabanov

Stockholm University

Bernhard Mehlig

University of Gothenburg

Hampus Linander

Chalmers, Mathematical Sciences, Algebra and geometry

University of Gothenburg

Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition

1063-6919 (ISSN)

Vol. 2023-June, pp. 13701-13711
9798350301298 (ISBN)

2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2023
Vancouver, Canada

Subject Categories

Computational Mathematics

Probability Theory and Statistics

DOI

10.1109/CVPR52729.2023.01317

More information

Latest update

10/24/2023