Investigating how ReLU-networks encode symmetries
Paper in proceedings, 2023

Many data symmetries can be described in terms of group equivariance, and the most common way of encoding group equivariances in neural networks is by building linear layers that are group equivariant. In this work, we investigate whether equivariance of a network implies that all layers are equivariant. On the theoretical side we find cases where equivariance implies layerwise equivariance, but also demonstrate that this is not the case in general. Nevertheless, we conjecture that CNNs that are trained to be equivariant will exhibit layerwise equivariance, and explain how this conjecture is a weaker version of the recent permutation conjecture by Entezari et al. [2022]. We perform quantitative experiments with VGG-nets on CIFAR10 and qualitative experiments with ResNets on ImageNet to illustrate and support our theoretical findings. These experiments are not only of interest for understanding how group equivariance is encoded in ReLU-networks; they also give a new perspective on Entezari et al.'s permutation conjecture, as we find that it is typically easier to merge a network with a group-transformed version of itself than to merge two different networks.
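To make the abstract's central notions concrete, here is a minimal NumPy sketch (not taken from the paper; the symbols `P`, `W`, and the flip group are illustrative assumptions) of what it means for a single linear layer to be equivariant to a group action. A layer `W` is equivariant to horizontal flips when applying the flip before the layer gives the same result as applying it after, i.e. `W P = P W` for the flip permutation matrix `P`; symmetrizing an arbitrary matrix over the group enforces this by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# P: the permutation matrix that reverses (horizontally flips) a length-n signal.
# The flip group here is just {identity, flip}, the simplest nontrivial symmetry.
P = np.eye(n)[::-1]

# Build a flip-equivariant linear layer by averaging over the group:
# W = A + P A P satisfies P W P = W, hence W P = P W (since P is its own inverse).
A = rng.standard_normal((n, n))
W = A + P @ A @ P

x = rng.standard_normal(n)

# Equivariance check: flipping the input then applying the layer
# equals applying the layer then flipping the output.
assert np.allclose(W @ (P @ x), P @ (W @ x))

# A generic (unsymmetrized) linear layer fails the same check.
B = rng.standard_normal((n, n))
assert not np.allclose(B @ (P @ x), P @ (B @ x))
print("symmetrized layer is flip-equivariant; generic layer is not")
```

The paper's question can be read against this sketch: a *network* may pass the end-to-end equivariance check even when its individual layers do not individually commute with the group action in this way.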

Authors

Georg Bökman

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Advances in Neural Information Processing Systems

1049-5258 (ISSN)

Vol. 36

37th Conference on Neural Information Processing Systems, NeurIPS 2023
New Orleans, USA

Subject Categories

Computer Engineering

Communication Systems

Discrete Mathematics

Areas of Advance

Information and Communication Technology

More information

Latest update

9/26/2024