Mathematical Foundations of Equivariant Neural Networks
Doctoral thesis, 2023
Geometric deep learning aims to reduce the amount of information that neural networks have to learn by taking advantage of geometric properties in data. In particular, equivariant neural networks use symmetries to reduce the complexity of a learning task. A symmetry is a transformation that leaves certain properties of an object unchanged. For example, rotation-equivariant neural networks trained to identify tumors in medical images are not sensitive to the orientation of a tumor within an image. Another example is graph neural networks, i.e., permutation-equivariant neural networks that operate on graphs such as molecules or social networks. Permuting the ordering of vertices and edges either transforms the output of a graph neural network in a predictable way (equivariance) or leaves the output unchanged (invariance).
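To make the two notions precise, one can use the standard formalization from the geometric deep learning literature (the symbols f, G, x, ρ_in, and ρ_out below are illustrative and not notation taken from the thesis): a map f between an input space and an output space, both acted on by a group G, is equivariant if

    f(ρ_in(g) x) = ρ_out(g) f(x)   for all g ∈ G and all inputs x,

and invariant in the special case where the output action ρ_out is trivial:

    f(ρ_in(g) x) = f(x)   for all g ∈ G and all inputs x.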
In this thesis, we study a fiber bundle-theoretic framework for equivariant neural networks. Fiber bundles are often used in mathematics and theoretical physics to model nontrivial geometries, and they offer a geometric approach to symmetry. This framework connects to many different areas of mathematics, including Fourier analysis, representation theory, and gauge theory, thus providing a large set of tools for analyzing equivariant neural networks.
Keywords
symmetry, geometric deep learning, gauge theory, induced representations, fiber bundles, convolutional neural networks, equivariance
Author
Jimmy Aronsson
Chalmers, Mathematical Sciences, Algebra and Geometry
Subject categories
Mathematics
ISBN
978-91-7905-854-8
Doctoral theses at Chalmers University of Technology. New series: 5320
Publisher
Chalmers
Defense venue: Pascal, Chalmers tvärgata 3
Opponent: Prof. Erik Bekkers, Amsterdam Machine Learning Lab (AMLab), University of Amsterdam, the Netherlands