In search of projectively equivariant networks
Article in a scientific journal, 2023

Equivariance of linear neural network layers is well studied. In this work, we relax the equivariance condition to only hold in a projective sense. In doing so, we introduce the topic of projective equivariance to the machine learning audience. We theoretically study the relation of projectively and linearly equivariant linear layers. We find that in some important cases, surprisingly, the two types of layers coincide. We also propose a way to construct a projectively equivariant neural network, which boils down to building a standard equivariant network where the linear group representations acting on each intermediate feature space are lifts of projective group representations. Projective equivariance is showcased in two simple experiments. Code for the experiments is provided at github.com/usinedepain/projectively_equivariant_deep_nets.
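The construction sketched in the abstract (a linear layer that is equivariant only up to a scalar, and whose equivariance becomes exact for a lift of the projective representation) can be illustrated with a standard textbook example not taken from the paper: the Pauli matrices X and Z form a projective representation of Z2 x Z2, since their products obey the group law only up to a sign; the group they generate (the Pauli group) is a lift on which the representation is linear. Below is a minimal NumPy sketch, checking equivariance up to sign (a special case of up-to-scalar); all function names here are illustrative, not the paper's API.

```python
import numpy as np

# Pauli matrices: a projective representation of Z2 x Z2.
# X @ Z = -(Z @ X), so the group law holds only up to a sign (phase).
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def rho(a, b):
    """Projective representation of the element (a, b) of Z2 x Z2."""
    return np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b)

def is_projectively_equivariant(W, tol=1e-9):
    """Check W rho(g) = +/- rho(g) W for all g, i.e. equivariance up
    to a sign. On the lift (the Pauli group, which contains both
    +rho(g) and -rho(g)), such a W commutes exactly."""
    for a in (0, 1):
        for b in (0, 1):
            R = rho(a, b)
            if not (np.allclose(W @ R, R @ W, atol=tol)
                    or np.allclose(W @ R, -R @ W, atol=tol)):
                return False
    return True

# Scalar multiples of the identity are strictly equivariant;
# W = X is only projectively equivariant (it anticommutes with Z).
assert is_projectively_equivariant(2.5 * np.eye(2))
assert is_projectively_equivariant(X)
assert not is_projectively_equivariant(np.array([[1., 2.], [3., 4.]]))
```

This illustrates why projectively equivariant layers can form a strictly larger family than linearly equivariant ones: W = X passes the projective check even though it does not commute with Z.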

Authors

Georg Bökman

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Axel Flinth

Umeå University

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Transactions on Machine Learning Research

2835-8856 (eISSN)

Vol. 2023

Subject categories (SSIF 2025)

Mathematics

Computer and Information Sciences (Computer Engineering)

More information

Last updated

2025-03-19