Two Tales of Single-Phase Contrastive Hebbian Learning
Paper in proceedings, 2024

The search for "biologically plausible" learning algorithms has converged on the idea of representing gradients as activity differences. However, most approaches require a high degree of synchronization (distinct phases during learning) and introduce substantial computational overhead, which raises doubts about their biological plausibility as well as their potential utility for neuromorphic computing. Furthermore, they commonly rely on applying infinitesimal perturbations (nudges) to output units, which is impractical in noisy environments. Recently, it has been shown that by modelling artificial neurons as dyads with two oppositely nudged compartments, a fully local learning algorithm named "dual propagation" can bridge the performance gap to backpropagation without requiring separate learning phases or infinitesimal nudging. However, the algorithm has the drawback that its numerical stability relies on symmetric nudging, which may be restrictive in biological and analog implementations. In this work we first provide a solid foundation for the objective underlying the dual propagation method, which also reveals a surprising connection with adversarial robustness. Second, we demonstrate how dual propagation is related to a particular adjoint state method, which remains stable even under asymmetric nudging.
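To make the dyadic-neuron idea concrete, the following is a minimal NumPy sketch of a dual-propagation-style update for a small two-layer network with a squared loss. It is an illustrative reconstruction, not the authors' reference implementation: the tanh units, the symmetric nudging strength beta, and the fixed number of relaxation sweeps are all assumptions made here. Each unit keeps two oppositely nudged states; their mean is passed forward, and their difference acts as the fully local error signal driving a Hebbian-style weight update.

    import numpy as np

    rng = np.random.default_rng(0)
    act = np.tanh

    # Tiny network: 4 -> 8 -> 3, squared loss, no biases (for brevity).
    sizes = [4, 8, 3]
    W = [0.3 * rng.standard_normal((m, n)) for n, m in zip(sizes, sizes[1:])]

    x = rng.standard_normal(sizes[0])
    y = rng.standard_normal(sizes[-1])
    beta = 0.5          # nudging strength; symmetric here (the stable regime)
    L = len(sizes) - 1

    # Dyadic states: every unit keeps a positively and a negatively nudged copy.
    s_pos = [x] + [np.zeros(m) for m in sizes[1:]]
    s_neg = [x] + [np.zeros(m) for m in sizes[1:]]

    for _ in range(30):                 # relax the dyadic states to a fixed point
        for k in range(1, L + 1):
            mean_below = 0.5 * (s_pos[k - 1] + s_neg[k - 1])
            a = W[k - 1] @ mean_below   # shared pre-activation from the mean below
            if k == L:
                # Output units are nudged up/down by the loss gradient.
                fb = -0.5 * beta * (act(a) - y)
            else:
                # Hidden units are nudged by the activity *difference* from above,
                # which plays the role of the backpropagated error signal.
                fb = 0.5 * (W[k].T @ (s_pos[k + 1] - s_neg[k + 1]))
            s_pos[k] = act(a + fb)
            s_neg[k] = act(a - fb)

    # Fully local, Hebbian-style weight update: difference (above) x mean (below).
    lr = 0.1
    for k in range(L):
        mean_below = 0.5 * (s_pos[k] + s_neg[k])
        diff_above = s_pos[k + 1] - s_neg[k + 1]
        W[k] += (lr / beta) * np.outer(diff_above, mean_below)

In the small-beta limit the activity differences approximate the backpropagated errors, so the outer-product update approaches the loss gradient; stronger or asymmetric nudging (distinct strengths for the two compartments) is precisely the regime whose stability the paper analyses.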

Authors

Rasmus Kjær Høier

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Christopher Zach

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Proceedings of Machine Learning Research

2640-3498 (eISSN)

Vol. 235, pp. 18470-18488

41st International Conference on Machine Learning (ICML 2024)
Vienna, Austria

Subject categories

Computer and Information Science

Other Engineering and Technologies

More information

Last updated

2024-11-06