Contrastive learning for lifted networks
Paper in proceedings, 2020

In this work we address supervised learning via lifted network formulations. Lifted networks are interesting because they allow training on massively parallel hardware and assign energy models to discriminatively trained neural networks. We demonstrate that training methods for lifted networks proposed in the literature have significant limitations, and therefore we propose to use a contrastive loss to train lifted networks. We show that this contrastive training approximates back-propagation in theory and in practice, and that it is superior to the regular training objective for lifted networks.
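To make the idea concrete, the following is a minimal sketch of contrastive training for a lifted network, assuming a one-hidden-layer network with quadratic layer-consistency penalties and gradient-descent relaxation of the activations; the layer sizes, step counts, and learning rates are illustrative choices, not the paper's settings. The loss is the gap between the lifted energy minimized with the output clamped to the target (clamped phase) and the energy minimized with the output left free (free phase).

```python
# Minimal sketch (not the paper's exact formulation) of contrastive
# training for a lifted network. All sizes and hyperparameters are
# illustrative assumptions.
import torch

torch.manual_seed(0)

def lifted_energy(x, z1, z2, W1, W2):
    # Quadratic layer-consistency penalties: z1 is the hidden activation,
    # z2 plays the role of the network output.
    return ((z1 - torch.relu(x @ W1)) ** 2).sum() + ((z2 - z1 @ W2) ** 2).sum()

def relax(x, W1, W2, y=None, steps=100, lr=0.05):
    # Minimize the energy over the activations with the weights held fixed;
    # if a target y is given, the output is clamped to it (clamped phase).
    W1, W2 = W1.detach(), W2.detach()
    z1 = torch.relu(x @ W1).requires_grad_(True)
    z2 = y if y is not None else (z1 @ W2).detach().requires_grad_(True)
    opt = torch.optim.SGD([z1] if y is not None else [z1, z2], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        lifted_energy(x, z1, z2, W1, W2).backward()
        opt.step()
    return z1.detach(), z2.detach()

# Toy regression problem with hypothetical sizes, just for illustration.
x, y = torch.randn(8, 4), torch.randn(8, 2)
W1 = (0.5 * torch.randn(4, 16)).requires_grad_(True)
W2 = (0.5 * torch.randn(16, 2)).requires_grad_(True)

opt = torch.optim.SGD([W1, W2], lr=0.01)
for step in range(100):
    z1c, z2c = relax(x, W1, W2, y=y)   # clamped phase
    z1f, z2f = relax(x, W1, W2)        # free phase
    opt.zero_grad()
    # Contrastive loss: energy gap between the two phases, differentiated
    # with respect to the weights only (activations are detached).
    gap = lifted_energy(x, z1c, z2c, W1, W2) - lifted_energy(x, z1f, z2f, W1, W2)
    gap.backward()
    opt.step()
    if step % 20 == 0:
        print(f"step {step:3d}  contrastive gap {gap.item():.4f}")
```

Since the free phase of a feedforward lifted energy relaxes toward the ordinary forward pass, where the consistency penalties vanish, the gap essentially measures how much energy is needed to reach the clamped target; this is the mechanism by which the contrastive objective comes to approximate back-propagation.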

Authors

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Virginia Estellers

Microsoft

30th British Machine Vision Conference (BMVC 2019)
Cardiff, United Kingdom

Subject categories

Computer Engineering

Communication Systems

Computer Systems

More information

Last updated

2020-09-03