Partially Exchangeable Networks and architectures for learning summary statistics in Approximate Bayesian Computation
We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive on both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
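The invariance described above can be sketched in code. The following is a minimal, illustrative PEN-style construction, not the paper's exact implementation: an inner network is applied to sliding blocks of `d + 1` consecutive observations and summed, the first `d` observations are kept raw, and an outer network produces the summary. The layer sizes, random weights, and the choice `d = 0` (the DeepSets special case, invariant to arbitrary permutations) are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(din, dh, dout, rng):
    # Small random (untrained) MLP weights, purely for illustration.
    return (rng.normal(size=(din, dh)) * 0.5, np.zeros(dh),
            rng.normal(size=(dh, dout)) * 0.5, np.zeros(dout))

def mlp(params, x):
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

def pen_summary(phi, rho, x, d):
    """PEN-style summary of a 1-D sequence x, assuming Markov order d:
    phi is applied to every sliding block of d+1 consecutive
    observations and the results are summed, making the output
    invariant to block-switch transformations; the first d
    observations are appended raw, and rho maps everything to the
    summary. With d = 0 this collapses to a DeepSets architecture."""
    n = len(x)
    blocks = np.stack([x[i:i + d + 1] for i in range(n - d)])
    pooled = mlp(phi, blocks).sum(axis=0)
    return mlp(rho, np.concatenate([x[:d], pooled]))

# DeepSets special case (d = 0): the summary is permutation invariant.
d = 0
phi = init_mlp(d + 1, 8, 4, rng)   # per-block feature extractor
rho = init_mlp(d + 4, 8, 2, rng)   # maps raw + pooled features to a 2-D summary
x = rng.normal(size=20)
s1 = pen_summary(phi, rho, x, d)
s2 = pen_summary(phi, rho, rng.permutation(x), d)
# s1 and s2 agree up to floating-point error
```

In an ABC pipeline, such a network would be trained so that its output approximates the posterior mean of the parameters, and the learned summary would then be used inside the ABC distance.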