Partially Exchangeable Networks and architectures for learning summary statistics in Approximate Bayesian Computation
Preprint, 2019

We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.

intractable likelihood

time series

deep learning
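As a rough illustration of the block-switch invariance described in the abstract, the sketch below sums an inner map over all overlapping blocks of length d+1 of a series and combines the result with the first d values; with d=0 this reduces to a DeepSets-style sum over elements. The names `phi`, `rho`, `pen` and the random toy weights are illustrative assumptions, not the paper's implementation (there, `phi` and `rho` are trained neural networks).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear maps standing in for the paper's neural networks (assumption).
W_phi = rng.normal(size=(2, 8))       # inner map acts on blocks of length d+1 = 2
W_rho = rng.normal(size=(1 + 8, 4))   # outer map sees the first d values + pooled features

def phi(blocks):
    # Inner map applied to each overlapping block x_{i:i+d}.
    return np.tanh(blocks @ W_phi)

def rho(head, pooled):
    # Outer map combining the first d values with the pooled block features.
    return np.tanh(np.concatenate([head, pooled]) @ W_rho)

def pen(x, d=1):
    """Partially exchangeable network of order d (sketch).

    Summing phi over all overlapping blocks of length d+1 makes the output
    invariant to block-switch transformations; d=0 recovers DeepSets.
    """
    blocks = np.stack([x[i:i + d + 1] for i in range(len(x) - d)])
    pooled = phi(blocks).sum(axis=0)
    return rho(x[:d], pooled)

# Block-switch example for d=1: swapping the two segments between the equal
# values 1 preserves the first value and the multiset of overlapping pairs,
# so the PEN output is unchanged.
x = np.array([1., 2., 3., 1., 4., 5., 1.])
x_switched = np.array([1., 4., 5., 1., 2., 3., 1.])
print(np.allclose(pen(x, d=1), pen(x_switched, d=1)))
```

The two series share the same starting value and the same multiset of consecutive pairs, which is exactly the information a PEN of order 1 uses, so both map to the same learned summary statistic.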


Samuel Wiqvist

Lund University

Pierre-Alexandre Mattei

IT University of Copenhagen

Umberto Picchini

Chalmers University of Technology, Mathematical Sciences, Applied Mathematics and Statistics

Jes Frellsen

IT University of Copenhagen


Other Computer and Information Science


Probability Theory and Statistics
