Monte Carlo Filtering Objectives
Paper in proceedings, 2021

Learning generative models and inferring latent trajectories for time series is challenging because flexible generative models have intractable marginal likelihoods; this is typically addressed by optimizing surrogate objectives. We propose Monte Carlo filtering objectives (MCFOs), a family of variational objectives for jointly learning parametric generative models and amortized adaptive importance proposals for time series. MCFOs extend the choice of likelihood estimator beyond the Sequential Monte Carlo estimators used in state-of-the-art objectives, possess properties that reveal the factors governing the tightness of the objectives, and admit gradient estimates with lower bias and variance. We demonstrate on various kinds of time series data that the proposed MCFOs and gradient estimators lead to efficient and stable model learning, that the learned generative models explain the data well, and that the learned importance proposals are more sample efficient.
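As a rough sketch of the underlying idea (not the paper's exact notation): given any unbiased Monte Carlo estimator $\hat{Z}_N$ of the marginal likelihood $p_\theta(x_{1:T})$, built from $N$ draws of an amortized proposal $q_\phi$, Jensen's inequality yields a variational lower bound of this filtering-objective type,

$$
\mathcal{L}_N(\theta,\phi) \;=\; \mathbb{E}_{q_\phi}\!\big[\log \hat{Z}_N(x_{1:T})\big] \;\le\; \log \mathbb{E}_{q_\phi}\!\big[\hat{Z}_N(x_{1:T})\big] \;=\; \log p_\theta(x_{1:T}),
$$

which tightens as the variance of $\hat{Z}_N$ shrinks (for instance, as $N$ grows). Choosing $\hat{Z}_N$ to be a Sequential Monte Carlo estimator recovers SMC-based objectives in the spirit of FIVO; allowing other unbiased estimators gives other members of the family, which is the extension the abstract refers to.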

Monte Carlo methods

Learning systems

Artificial intelligence

Authors

Shuangshuang Chen

Kungliga Tekniska Högskolan (KTH)

Volvo

Sihao Ding

Volvo

Yiannis Karayiannidis

Chalmers, Electrical Engineering, Systems and Control

Mårten Björkman

Kungliga Tekniska Högskolan (KTH)

IJCAI International Joint Conference on Artificial Intelligence

1045-0823 (ISSN)

pp. 2256-2262
9780999241196 (ISBN)

30th International Joint Conference on Artificial Intelligence, IJCAI 2021, Virtual/Online, Canada

Subject categories

Bioinformatics (Computational Biology)

Probability Theory and Statistics

Computer Vision and Robotics (Autonomous Systems)

More information

Last updated

2022-03-14