Investigating how ReLU-networks encode symmetries
Paper in proceedings, 2023

Many data symmetries can be described in terms of group equivariance, and the most common way of encoding group equivariances in neural networks is by building linear layers that are group equivariant. In this work we investigate whether equivariance of a network implies that all layers are equivariant. On the theoretical side we find cases where equivariance implies layerwise equivariance, but also demonstrate that this is not the case in general. Nevertheless, we conjecture that CNNs that are trained to be equivariant will exhibit layerwise equivariance, and explain how this conjecture is a weaker version of the recent permutation conjecture by Entezari et al. [2022]. We perform quantitative experiments with VGG-nets on CIFAR10 and qualitative experiments with ResNets on ImageNet to illustrate and support our theoretical findings. These experiments are not only of interest for understanding how group equivariance is encoded in ReLU-networks, but they also give a new perspective on Entezari et al.'s permutation conjecture, as we find that it is typically easier to merge a network with a group-transformed version of itself than to merge two different networks.
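As a minimal sketch (not taken from the paper) of what layerwise equivariance means, consider the horizontal-flip group acting on images: a convolutional layer whose kernel is symmetric under horizontal flips commutes with the flip, i.e. applying the layer to a flipped input gives the flipped output. The helper `conv2d` below is an illustrative valid cross-correlation, not code from the authors.

```python
import numpy as np

def conv2d(x, k):
    """Valid 2D cross-correlation of image x with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k = rng.standard_normal((3, 3))
k = 0.5 * (k + k[:, ::-1])  # symmetrize the kernel under horizontal flip

flip = lambda a: a[:, ::-1]  # the group action: horizontal flip
lhs = conv2d(flip(x), k)     # transform input, then apply layer
rhs = flip(conv2d(x, k))     # apply layer, then transform output
print(np.allclose(lhs, rhs))  # True: this layer is flip-equivariant
```

A layerwise-equivariant network composes such layers (together with equivariant nonlinearities like ReLU); the paper's question is whether an end-to-end equivariant network must decompose this way.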

Authors

Georg Bökman

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Advances in Neural Information Processing Systems

1049-5258 (ISSN)

Vol. 36

37th Conference on Neural Information Processing Systems, NeurIPS 2023
New Orleans, USA

Subject categories

Computer Engineering

Communication Systems

Discrete Mathematics

Areas of Advance

Information and Communication Technology

More information

Last updated

2024-08-07