Contrastive learning for lifted networks
Paper in proceedings, 2020

In this work we address supervised learning via lifted network formulations. Lifted networks are interesting because they allow training on massively parallel hardware and assign energy models to discriminatively trained neural networks. We demonstrate that training methods for lifted networks proposed in the literature have significant limitations, and therefore we propose to use a contrastive loss to train lifted networks. We show that this contrastive training approximates back-propagation in theory and in practice, and that it is superior to the regular training objective for lifted networks.
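The entry contains no code, so the following is only a minimal sketch of the general idea behind contrastive training of a lifted network, assuming a quadratic-penalty lifted energy with ReLU layers, gradient-based relaxation of the activations, and a weight update that descends on the clamped-minus-free energy difference. The function names (`energy`, `relax`, `contrastive_step`) and all hyper-parameters are illustrative assumptions, not the paper's implementation.

```python
import torch

def energy(weights, zs, x):
    # Lifted energy: each layer's activations z_l are free variables,
    # penalized for deviating from the feedforward map of the layer below.
    e, prev = 0.0, x
    for W, z in zip(weights, zs):
        e = e + 0.5 * (z - torch.relu(prev @ W)).pow(2).sum()
        prev = z
    return e

def relax(weights, x, y=None, steps=50, lr=0.1):
    # Approximately minimize the energy over the activations with the
    # weights held fixed. If a target y is given, the output layer is
    # clamped to it ("clamped" phase); otherwise all layers are free.
    Ws = [W.detach() for W in weights]
    zs, prev = [], x
    for W in Ws:  # initialize activations with a feedforward pass
        prev = torch.relu(prev @ W).detach().requires_grad_(True)
        zs.append(prev)
    if y is not None:
        zs[-1] = y  # the clamped output is not optimized
    opt = torch.optim.SGD(zs[:-1] if y is not None else zs, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        energy(Ws, zs, x).backward()
        opt.step()
    return [z.detach() for z in zs]

def contrastive_step(weights, x, y, lr=1e-2):
    # Contrastive update: descend on E(W, Z_clamped) - E(W, Z_free),
    # treating both sets of relaxed activations as constants.
    z_free, z_clamped = relax(weights, x), relax(weights, x, y)
    loss = energy(weights, z_clamped, x) - energy(weights, z_free, x)
    grads = torch.autograd.grad(loss, weights)
    with torch.no_grad():
        for W, g in zip(weights, grads):
            W -= lr * g
    return loss.item()

# Toy usage on random data with a two-layer network; small initial
# weights keep the quadratic relaxation stable.
torch.manual_seed(0)
weights = [(0.1 * torch.randn(4, 8)).requires_grad_(True),
           (0.1 * torch.randn(8, 3)).requires_grad_(True)]
x, y = torch.randn(16, 4), torch.randn(16, 3)
for _ in range(10):
    print(contrastive_step(weights, x, y))
```

In this sketch the weight gradient comes from the difference between the clamped and free energies, which is the sense in which a contrastive loss on a lifted network can approximate the back-propagated gradient.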

Authors

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Virginia Estellers

Microsoft

30th British Machine Vision Conference, BMVC 2019
Cardiff, United Kingdom

Subject Categories

Computer Engineering

Communication Systems

Computer Systems

More information

Latest update

9/3/2020