Truncated Inference for Latent Variable Optimization Problems: Application to Robust Estimation and Learning
Paper in proceedings, 2020

Optimization problems with an auxiliary latent variable structure in addition to the main model parameters occur frequently in computer vision and machine learning. The additional latent variables make the underlying optimization task expensive, either in terms of memory (the latent variables must be maintained) or in terms of runtime (the latent variables must be repeatedly and exactly inferred). We aim to remove the need to maintain the latent variables and propose two formally justified methods that dynamically adapt the required accuracy of latent variable inference. These methods have applications in large-scale robust estimation and in learning energy-based models from labeled data.
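To illustrate the general idea described in the abstract (not the paper's specific algorithms), the sketch below shows a majorization-minimization scheme for robust regression with a Huber-style loss, where the latent per-residual weights are recomputed on the fly rather than stored, and the inner weighted least-squares problem is solved only approximately ("truncated") with a few gradient steps. All function and parameter names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def robust_fit(A, b, max_outer=50, inner_steps=3, tau=1.0, lr=None):
    """Illustrative sketch: majorization-minimization (IRLS-style) for
    min_theta sum_i rho(a_i^T theta - b_i) with a Huber-style rho.

    The latent per-residual weights w_i are recomputed from the current
    residuals each outer iteration instead of being maintained in memory,
    and the inner surrogate problem is only solved approximately
    ("truncated inference") with a few gradient steps.
    """
    m, n = A.shape
    theta = np.zeros(n)
    if lr is None:
        # conservative step size from the spectral norm of A (W <= I,
        # so 1 / ||A||_2^2 also bounds the weighted Hessian)
        lr = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(max_outer):
        r = A @ theta - b
        # latent Huber weights, derived from the residuals on the fly
        w = np.where(np.abs(r) <= tau, 1.0,
                     tau / np.maximum(np.abs(r), 1e-12))
        # truncated inner solve: a few gradient steps on the majorizer
        for _ in range(inner_steps):
            g = A.T @ (w * (A @ theta - b))
            theta = theta - lr * g
    return theta
```

For example, fitting a line to data containing gross outliers recovers parameters close to the ground truth, whereas ordinary least squares would be pulled toward the outliers.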

Truncated Inference

Robust Estimation

Majorization-Minimization

Large-Scale Optimization

Author

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Huu Le

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN), 1611-3349 (eISSN)

Vol. 12371 LNCS, pp. 464-480
978-3-030-58573-0 (ISBN)

16th European Conference on Computer Vision (ECCV 2020)
Glasgow, United Kingdom

Subject Categories

Computational Mathematics

Bioinformatics (Computational Biology)

Probability Theory and Statistics

DOI

10.1007/978-3-030-58574-7_28

More information

Latest update

12/18/2020