Monte Carlo Filtering Objectives
Paper in proceedings, 2021

Learning generative models and inferring latent trajectories for time series have proven challenging because the marginal likelihoods of flexible generative models are intractable. This can be addressed by optimizing surrogate objectives instead. We propose Monte Carlo filtering objectives (MCFOs), a family of variational objectives for jointly learning parametric generative models and amortized adaptive importance proposals for time series. MCFOs extend the choice of likelihood estimator beyond the Sequential Monte Carlo estimators used in state-of-the-art objectives, possess important properties that reveal the factors governing the tightness of an objective, and admit gradient estimates with lower bias and variance. We demonstrate that the proposed MCFOs and gradient estimators lead to efficient and stable model learning, and that on various kinds of time series data the learned generative models explain the data well and the learned importance proposals are more sample-efficient.
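The paper's exact MCFO constructions are not reproduced here; as a minimal sketch of the underlying principle that the abstract builds on, the log of an unbiased Monte Carlo estimator of the marginal likelihood lower-bounds the true log marginal likelihood (by Jensen's inequality), so it can serve as a surrogate training objective. The snippet below illustrates this with a simple importance-sampling estimator on a toy linear-Gaussian model; the model, proposal, and function names are illustrative assumptions, not the paper's method.

```python
import numpy as np

def iw_log_marginal(x, n_samples=1000, rng=None):
    """Importance-weighted estimate of log p(x) for a toy model:
    prior p(z) = N(0, 1), likelihood p(x|z) = N(z, 1), hence p(x) = N(0, 2).
    The proposal q(z|x) = N(x/2, 1/2) happens to be the exact posterior,
    so the importance weights are constant and the estimate is exact;
    any proposal with full support would still give a consistent estimator.
    """
    rng = np.random.default_rng(rng)
    z = rng.normal(loc=x / 2.0, scale=np.sqrt(0.5), size=n_samples)
    log_p_z = -0.5 * (z**2 + np.log(2 * np.pi))
    log_p_x_given_z = -0.5 * ((x - z) ** 2 + np.log(2 * np.pi))
    log_q = -0.5 * ((z - x / 2.0) ** 2 / 0.5 + np.log(2 * np.pi * 0.5))
    # Unnormalized log importance weights log[p(z) p(x|z) / q(z|x)].
    log_w = log_p_z + log_p_x_given_z - log_q
    # log of the sample mean of the weights, computed stably (log-sum-exp).
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))
```

With a worse proposal the same quantity is, in expectation, a strict lower bound on log p(x); tightening that bound (e.g. with more samples or a better estimator, the design space MCFOs explore) is what drives joint learning of the model and the proposal.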

Monte Carlo methods

Learning systems

Artificial intelligence

Authors

Shuangshuang Chen

Royal Institute of Technology (KTH)

Volvo

Sihao Ding

Volvo

Yiannis Karayiannidis

Chalmers, Electrical Engineering, Systems and control, Mechatronics

Mårten Björkman

Royal Institute of Technology (KTH)

IJCAI International Joint Conference on Artificial Intelligence

1045-0823 (ISSN)

2256-2262 (pages)
9780999241196 (ISBN)

30th International Joint Conference on Artificial Intelligence, IJCAI 2021
Virtual/Online, Canada

Subject Categories

Bioinformatics (Computational Biology)

Probability Theory and Statistics

Computer Vision and Robotics (Autonomous Systems)

More information

Latest update

3/14/2022