Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation

Conference paper

Authors: Samuel Wiqvist, Pierre-Alexandre Mattei, Umberto Picchini, Jes Frellsen
Published in: Proceedings of the 36th International Conference on Machine Learning
Publisher: PMLR
Publication year: 2019
Published at: Department of Mathematical Sciences, Applied Mathematics and Statistics
Language: English
Links: proceedings.mlr.press/v97/wiqvist19...
Keywords: deep learning; intractable likelihood; Markov data; time series
Subject categories: Statistics, computer and systems science, Mathematical statistics, Statistics, Probability Theory and Statistics

Abstract

We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). Compared with previous deep learning methods for learning summary statistics, our results are highly competitive, for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
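
Illustration: the block-switch invariant construction described in the abstract can be sketched compactly. The following is a minimal, hypothetical PyTorch sketch of a PEN-style summary network for a time series assumed to be Markov of order d: an inner network is applied to each sliding window of d+1 consecutive observations, the window representations are sum-pooled (which yields the block-switch invariance), and an outer network maps the first d observations together with the pooled representation to the summary statistics. The class name PEN, the layer sizes, and the choice of networks are illustrative assumptions, not the authors' exact architecture; with d = 0 the sketch reduces to a DeepSets-style network for fully exchangeable data.

# Minimal, hypothetical sketch of a PEN-style summary network (not the paper's exact code).
import torch
import torch.nn as nn


class PEN(nn.Module):
    def __init__(self, d: int, hidden: int = 64, phi_out: int = 32, n_summaries: int = 2):
        super().__init__()
        self.d = d
        # Inner network phi: one window of d+1 consecutive observations -> representation.
        self.phi = nn.Sequential(
            nn.Linear(d + 1, hidden), nn.ReLU(), nn.Linear(hidden, phi_out)
        )
        # Outer network rho: first d observations plus pooled representation -> summaries.
        self.rho = nn.Sequential(
            nn.Linear(d + phi_out, hidden), nn.ReLU(), nn.Linear(hidden, n_summaries)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T) batch of univariate time series.
        windows = x.unfold(dimension=1, size=self.d + 1, step=1)  # (batch, T-d, d+1)
        pooled = self.phi(windows).sum(dim=1)                     # sum-pool over windows
        head = x[:, : self.d]                                     # first d observations
        return self.rho(torch.cat([head, pooled], dim=1))         # summary statistics


# Usage sketch: summaries for a batch of 8 series of length 100, assuming Markov order 1.
net = PEN(d=1)
summaries = net(torch.randn(8, 100))  # shape (8, n_summaries)

In an ABC setting such a network would typically be trained to regress the model parameters from simulated data, and its outputs then used as summary statistics when comparing observed and simulated datasets.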
