Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation
Paper in proceedings, 2019

We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so fully exchangeable data can also be targeted. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). Compared to previous deep learning methods for learning summary statistics, PENs are highly competitive, for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
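To make the construction in the abstract concrete, here is a minimal PyTorch sketch of a PEN for data of Markov order d, under stated assumptions: an inner network phi is applied to every overlapping block of d + 1 consecutive observations, the outputs are sum-pooled (which yields the block-switch invariance), and an outer network rho maps the first d observations, concatenated with that pooled sum, to the summary statistics. The class name, layer widths, and output dimension are illustrative guesses rather than the paper's exact configuration.

import torch
import torch.nn as nn

class PEN(nn.Module):
    """Sketch of a partially exchangeable network of Markov order d.

    Hypothetical layer sizes. The inner network phi sees every overlapping
    block of d + 1 consecutive observations; summing its outputs makes the
    model invariant to block-switch transformations. The outer network rho
    maps the first d observations plus the pooled sum to the summaries.
    """

    def __init__(self, d, phi_out=50, n_summaries=2):
        super().__init__()
        self.d = d
        self.phi = nn.Sequential(
            nn.Linear(d + 1, 100), nn.ReLU(),
            nn.Linear(100, phi_out), nn.ReLU(),
        )
        self.rho = nn.Sequential(
            nn.Linear(d + phi_out, 100), nn.ReLU(),
            nn.Linear(100, n_summaries),
        )

    def forward(self, x):
        # x: (batch, M) time series; all overlapping blocks of length d + 1
        blocks = x.unfold(dimension=1, size=self.d + 1, step=1)
        pooled = self.phi(blocks).sum(dim=1)   # sum-pool over blocks
        head = x[:, :self.d]                   # the first d observations
        return self.rho(torch.cat([head, pooled], dim=1))

# Example: summaries for 8 series of length 100 from an order-1 model.
stats = PEN(d=1)(torch.randn(8, 100))

With d = 0 the head is empty and the construction reduces to sum-pooling over single observations, matching the abstract's claim that DeepSets is a special case; in an ABC run, the trained network's outputs would serve as the summary statistics compared between observed and simulated data.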

intractable likelihoods

deep learning

parameter inference

time series

Authors

Samuel Wiqvist

Lund University

Pierre-Alexandre Mattei

IT University of Copenhagen

Umberto Picchini

Chalmers, Mathematical Sciences, Applied Mathematics and Statistics

Jes Frellsen

IT University of Copenhagen

Proceedings of the 36th International Conference on Machine Learning

Vol. 97, pp. 6798–6807

Subject categories

Other computer and information science

Probability theory and statistics

More information

Last updated

2019-05-29