Fully Variational Noise-Contrastive Estimation
Paper in proceedings, 2023

By using the underlying theory of proper scoring rules, we design a family of noise-contrastive estimation (NCE) methods that are tractable for latent variable models. Both terms in the underlying NCE loss, the one using data samples and the one using noise samples, can be lower-bounded as in variational Bayes; we therefore call this family of losses fully variational noise-contrastive estimation. Variational autoencoders are a particular example in this family and can therefore also be understood as separating real data from synthetic samples using an appropriate classification loss. We further discuss other instances in this family of fully variational NCE objectives and indicate differences in their empirical behavior.
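To make the idea in the abstract concrete, the sketch below is a minimal, illustrative reading only, not code or a derivation from the paper: it plugs a single-sample ELBO, i.e. a variational lower bound on the intractable log p_theta(x) of a latent variable model computed with an amortized encoder as in a VAE, into the standard logistic NCE loss against a tractable Gaussian noise density. All names here (Encoder, Decoder, elbo, nce_loss, log_pn) and the toy data are assumptions made for illustration; the paper's actual lower bounds on the two NCE terms follow from its proper-scoring-rule construction and are not reproduced here.

```python
# Illustrative sketch (NOT the paper's method): logistic NCE loss where the
# intractable log p_theta(x) is replaced by a single-sample ELBO, VAE-style.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Amortized Gaussian posterior q(z|x)."""
    def __init__(self, x_dim, z_dim, h_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, h_dim), nn.Tanh())
        self.mu = nn.Linear(h_dim, z_dim)
        self.log_var = nn.Linear(h_dim, z_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.log_var(h)

class Decoder(nn.Module):
    """Gaussian likelihood p(x|z) with fixed unit variance (mean only)."""
    def __init__(self, x_dim, z_dim, h_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, h_dim), nn.Tanh(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, z):
        return self.net(z)

def elbo(x, enc, dec):
    """Single-sample ELBO: a lower bound on log p_theta(x)."""
    mu, log_var = enc(x)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * log_var)  # reparameterization
    x_hat = dec(z)
    # Gaussian log-likelihood with unit variance, constants included
    log_px_z = -0.5 * ((x - x_hat) ** 2).sum(-1) - 0.5 * x.shape[-1] * math.log(2 * math.pi)
    # KL(q(z|x) || N(0, I)) in closed form
    kl = 0.5 * (mu ** 2 + log_var.exp() - 1.0 - log_var).sum(-1)
    return log_px_z - kl

def nce_loss(x_data, x_noise, log_pn, enc, dec):
    """Logistic NCE loss with log p_theta(x) approximated by the ELBO.
    log_pn(x) must return the tractable log-density of the noise distribution."""
    # Logit of the NCE classifier: log p_theta(x) - log p_n(x)
    logit_data = elbo(x_data, enc, dec) - log_pn(x_data)
    logit_noise = elbo(x_noise, enc, dec) - log_pn(x_noise)
    loss_data = F.softplus(-logit_data).mean()   # -log sigma(logit) on real data
    loss_noise = F.softplus(logit_noise).mean()  # -log(1 - sigma(logit)) on noise
    return loss_data + loss_noise

if __name__ == "__main__":
    torch.manual_seed(0)
    x_dim, z_dim = 8, 2
    enc, dec = Encoder(x_dim, z_dim), Decoder(x_dim, z_dim)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
    # Standard-normal noise distribution with closed-form log-density
    log_pn = lambda x: -0.5 * (x ** 2).sum(-1) - 0.5 * x_dim * math.log(2 * math.pi)
    for _ in range(100):
        x_data = 0.5 * torch.randn(128, x_dim) + 1.0   # toy "real" data
        x_noise = torch.randn(128, x_dim)              # samples from p_n
        loss = nce_loss(x_data, x_noise, log_pn, enc, dec)
        opt.zero_grad(); loss.backward(); opt.step()
```

Note that simply substituting a lower bound on log p_theta(x) into both NCE terms, as done above, only approximates the objective; the abstract's point is that each of the two terms itself admits a proper variational lower bound, which is what makes the family "fully variational".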

Author

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

0302-9743 (ISSN), 1611-3349 (eISSN)

Vol. 13886 LNCS, pp. 175-190
978-3-031-31437-7 (ISBN)

22nd Scandinavian Conference on Image Analysis, SCIA 2023
Lapland, Finland

Subject Categories

Information Science

Computer Science

Computer Systems

DOI

10.1007/978-3-031-31438-4_12

More information

Latest update

6/27/2023