Algebraic Positional Encodings
Paper in proceedings, 2024

We introduce a novel positional encoding strategy for Transformer-style models, addressing the shortcomings of existing, often ad hoc, approaches. Our framework implements a flexible mapping from the algebraic specification of a domain to a positional encoding scheme where positions are interpreted as orthogonal operators. This design preserves the structural properties of the source domain, thereby ensuring that the end-model upholds them. The framework can accommodate various structures, including sequences, grids and trees, as well as their compositions. We conduct a series of experiments demonstrating the practical applicability of our method. Our results suggest performance on par with or surpassing the current state of the art, without hyper-parameter optimization or "task search" of any kind.
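The core idea of interpreting positions as orthogonal operators can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it assumes a sequence domain and uses a block-diagonal rotation matrix (the angles `thetas` are illustrative) whose k-th power represents position k. Orthogonality then gives the relative-position property: an inner product between vectors encoded at positions m and n depends only on n - m.

```python
import numpy as np

def rotation_block(theta):
    """A 2x2 rotation matrix, the simplest orthogonal operator."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def orthogonal_position_operator(k, thetas):
    """Encode position k as the k-th power of a fixed orthogonal operator.

    The operator is block-diagonal over 2x2 rotations, so raising it to
    the k-th power amounts to scaling each rotation angle by k.
    (Illustrative sketch only; `thetas` is a hypothetical parameter.)
    """
    d = 2 * len(thetas)
    op = np.zeros((d, d))
    for i, t in enumerate(thetas):
        op[2 * i : 2 * i + 2, 2 * i : 2 * i + 2] = rotation_block(k * t)
    return op

thetas = np.array([0.5, 0.1, 0.02])

# Orthogonality: W_k @ W_k.T is the identity, so the encoding is norm-preserving.
W3 = orthogonal_position_operator(3, thetas)
assert np.allclose(W3 @ W3.T, np.eye(6))

# Relative-position property: W_m.T @ W_n equals W_{n-m},
# so <W_m q, W_n k> = <q, W_{n-m} k> depends only on n - m.
W2 = orthogonal_position_operator(2, thetas)
W5 = orthogonal_position_operator(5, thetas)
assert np.allclose(W2.T @ W5, orthogonal_position_operator(3, thetas))
```

Because the rotation blocks commute, powers of the operator compose additively in position, which is what lets the scheme respect the algebraic structure of the sequence domain.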

Authors

Konstantinos Kogkalidis

Aalto University

Università di Bologna

Jean-Philippe Bernardy

Chalmers, Computer Science and Engineering, Computing Science

University of Gothenburg

V. Garg

Aalto University

YaiYai

Advances in Neural Information Processing Systems

1049-5258 (ISSN)

Vol. 37

Advances in Neural Information Processing Systems 38, Vancouver, Canada

Subject categories (SSIF 2025)

Computer and Information Sciences (Computer Engineering)

Related datasets

Code: Algebraic Positional Encodings [dataset]

URI: https://aalto-quml.github.io/ape/

More information

Last updated

2025-04-03