We construct new methods for accurate Bayesian statistical inference of parameters in stochastic dynamical models. Mathematical models for real-world applications are typically so complex that exact statistical inference is unattainable. So-called approximate Bayesian computation (ABC) provides a viable alternative for models lacking an analytically tractable likelihood function, by relying on computer simulations. However, to achieve satisfactory accuracy, ABC requires informative summary statistics of the simulated and observed data. We will exploit the expressive power of deep learning (DL) and construct new deep neural networks that (i) automatically return the required summary statistics, and (ii) provide summary statistics that are more informative than those of alternative methods; (iii) we will also construct efficient ways to train these networks. By marrying ABC with DL we obtain a powerful, plug-and-play inference tool for dynamical models. We specifically target inference for state-space models and stochastic differential equations. These models are very widely used in practical applications, yet standard inference tools require a high level of technical expertise and are difficult to generalize. Our DL-ABC strategy only requires forward simulations from the computer model, thus offering a flexible, simulation-based approach. The range of possible applications is vast, and we consider three challenging case studies.
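To illustrate the simulation-based mechanism that ABC relies on, the following is a minimal sketch of basic ABC rejection sampling in Python. The toy model (a Gaussian random walk with unknown drift), the uniform prior, and the hand-crafted `summary` function are all illustrative assumptions, not the proposal's method; in the proposed DL-ABC approach, `summary` would instead be a trained deep neural network returning informative summary statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Toy stochastic model (assumption): Gaussian random walk with drift theta,
    # standing in for the state-space models / SDEs targeted in the proposal.
    return np.cumsum(rng.normal(theta, 1.0, size=n))

def summary(x):
    # Hand-crafted summary statistics (assumption); the proposed DL-ABC would
    # replace this with a neural network trained to output informative summaries.
    return np.array([x[-1], x.mean()])

def abc_rejection(observed, prior_draws=5000, quantile=0.01):
    # Basic ABC rejection: draw parameters from the prior, forward-simulate,
    # and keep the draws whose simulated summaries are closest to the observed.
    s_obs = summary(observed)
    thetas = rng.uniform(-2.0, 2.0, size=prior_draws)  # uniform prior (assumption)
    dists = np.array([np.linalg.norm(summary(simulate(t)) - s_obs) for t in thetas])
    eps = np.quantile(dists, quantile)  # accept the closest fraction of draws
    return thetas[dists <= eps]

x_obs = simulate(0.7)        # pretend these are the observed data
posterior = abc_rejection(x_obs)
print(posterior.mean())      # crude posterior mean, expected near the true drift 0.7
```

The accuracy of the accepted sample hinges entirely on how informative `summary` is, which is precisely the gap the proposed deep networks are meant to fill.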
Senior Lecturer at Chalmers, Mathematical Sciences, Applied Mathematics and Statistics
Funding Chalmers participation during 2020–2024