Flopping for FLOPs: Leveraging Equivariance for Computational Efficiency
Paper in proceedings, 2025

Incorporating geometric invariance into neural networks enhances parameter efficiency but typically increases computational costs. This paper introduces new equivariant neural networks that preserve symmetry while maintaining a comparable number of floating-point operations (FLOPs) per parameter to standard non-equivariant networks. We focus on horizontal mirroring (flopping) invariance, common in many computer vision tasks. The main idea is to parametrize the feature spaces in terms of mirror-symmetric and mirror-antisymmetric features, i.e., irreps of the flopping group. This decomposition makes the linear layers block-diagonal, requiring half the number of FLOPs. Our approach reduces both FLOPs and wall-clock time, providing a practical solution for efficient, scalable symmetry-aware architectures.
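As a minimal sketch of the idea described in the abstract (hypothetical PyTorch code, not the authors' implementation; the names BlockDiagonalLinear and split_into_irreps are illustrative only): projecting features onto mirror-symmetric and mirror-antisymmetric components, and letting the linear layer act on each component independently, replaces one d x d weight matrix with two (d/2) x (d/2) blocks, roughly halving the multiply-adds.

import torch
import torch.nn as nn


def split_into_irreps(feat: torch.Tensor):
    """Project a feature map (..., H, W) onto its mirror-symmetric and
    mirror-antisymmetric parts via a horizontal flip (the two irreps of
    the flopping group)."""
    flipped = torch.flip(feat, dims=[-1])
    sym = 0.5 * (feat + flipped)
    anti = 0.5 * (feat - flipped)
    return sym, anti


class BlockDiagonalLinear(nn.Module):
    """Linear layer acting separately on symmetric and antisymmetric channels.

    A full d x d linear layer costs about d^2 multiply-adds per feature
    vector; two independent (d/2) x (d/2) blocks cost about d^2 / 2,
    i.e. half the FLOPs."""

    def __init__(self, dim: int):
        super().__init__()
        assert dim % 2 == 0, "feature dimension must split into two halves"
        half = dim // 2
        self.sym = nn.Linear(half, half)    # mirror-symmetric channels
        self.anti = nn.Linear(half, half)   # mirror-antisymmetric channels

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., dim); first half symmetric, second half antisymmetric
        # under horizontal mirroring.
        x_sym, x_anti = x.chunk(2, dim=-1)
        return torch.cat([self.sym(x_sym), self.anti(x_anti)], dim=-1)


if __name__ == "__main__":
    layer = BlockDiagonalLinear(dim=8)
    x = torch.randn(4, 8)
    print(layer(x).shape)  # torch.Size([4, 8])

In this toy setup, flipping the input spatial axis negates the antisymmetric channels and leaves the symmetric ones unchanged, so the block-diagonal layer commutes with the flop action by construction.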


Authors

Georg Bökman

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

David Nordström

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Fredrik Kahl

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Proceedings of Machine Learning Research

2640-3498 (eISSN)

Vol. 267, pp. 4823-4838

42nd International Conference on Machine Learning, ICML 2025
Vancouver, Canada

Subject categories (SSIF 2025)

Communication Systems

Computer Science

More information

Last updated

2025-12-11