RydbergGPT
Journal article, 2025

We introduce a generative pretrained transformer (GPT) designed to learn the measurement outcomes of a neutral-atom-array quantum computer. Based on a vanilla transformer, our encoder–decoder architecture takes the interacting Hamiltonian as input and outputs an autoregressive sequence of qubit measurement probabilities. Its performance is studied in the vicinity of a quantum phase transition in a square-lattice array of Rydberg atoms. We explore the model's generalization capabilities by demonstrating that it accurately predicts ground-state measurement outcomes for Hamiltonian parameter values that were not included in the training data. We evaluate three model variants, each trained for a fixed duration on a single NVIDIA A100 GPU, by examining their predictions of key physical observables. These results establish performance benchmarks for scaling to larger RydbergGPT models in the future. Finally, we release RydbergGPT as open-source software to facilitate the development of foundation models for diverse quantum computing platforms and datasets.
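The encoder–decoder scheme described in the abstract can be illustrated with a minimal sketch (this is an assumption-laden toy, not the authors' released code): an encoder ingests a small set of Hamiltonian parameters, and a causally masked decoder autoregressively emits per-qubit measurement probabilities.

```python
# Toy sketch of a RydbergGPT-style encoder-decoder (illustrative only;
# the class name, parameter set, and sizes are assumptions, not the
# authors' implementation).
import torch
import torch.nn as nn


class ToyRydbergGPT(nn.Module):
    def __init__(self, d_model=32, n_qubits=16):
        super().__init__()
        self.n_qubits = n_qubits
        # Encoder input: one token per Hamiltonian parameter
        # (e.g. Rabi frequency, detuning, blockade radius).
        self.param_proj = nn.Linear(1, d_model)
        # Decoder input: previously "measured" qubit states plus a start token.
        self.state_embed = nn.Embedding(3, d_model)  # tokens: 0, 1, start(=2)
        self.pos_embed = nn.Embedding(n_qubits, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            dim_feedforward=64, batch_first=True,
        )
        self.head = nn.Linear(d_model, 1)  # logit of P(qubit occupied)

    def forward(self, params, states):
        # params: (B, P) Hamiltonian parameters; states: (B, n_qubits) in {0, 1}
        memory_in = self.param_proj(params.unsqueeze(-1))        # (B, P, d)
        start = torch.full_like(states[:, :1], 2)                # start token
        tgt_tokens = torch.cat([start, states[:, :-1]], dim=1)   # shift right
        pos = torch.arange(self.n_qubits, device=states.device)
        tgt = self.state_embed(tgt_tokens) + self.pos_embed(pos)
        # Causal mask enforces the autoregressive factorization.
        mask = nn.Transformer.generate_square_subsequent_mask(self.n_qubits)
        out = self.transformer(memory_in, tgt, tgt_mask=mask)
        return torch.sigmoid(self.head(out)).squeeze(-1)         # (B, n_qubits)


model = ToyRydbergGPT()
params = torch.randn(2, 3)                # e.g. (Omega, delta, R_b) per sample
states = torch.randint(0, 2, (2, 16))     # a batch of measurement bitstrings
probs = model(params, states)
print(probs.shape)  # torch.Size([2, 16])
```

Training such a model amounts to maximum-likelihood on measured bitstrings; sampling qubit-by-qubit from the conditional probabilities then generates new measurement outcomes for a given Hamiltonian.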

neutral atom arrays

machine learning in physics

machine learning

Rydberg atoms

quantum computing

generative pretrained transformer

Authors

David Fitzek

Volvo Group

Chalmers, Microtechnology and Nanoscience (MC2), Applied Quantum Physics

Yi Hong Teoh

University of Waterloo

Cyrus P.C. Fung

University of Waterloo

Gebremedhin A. Dagnew

University of Waterloo

Ejaaz Merali

University of Waterloo

M. Schuyler Moss

University of Waterloo

Benjamin MacLellan

University of Waterloo

Roger G. Melko

Perimeter Institute for Theoretical Physics

University of Waterloo

Machine Learning: Science and Technology

2632-2153 (eISSN)

Vol. 6, Iss. 4, 045057

WACQT Quantum Technology Testbed

Knut and Alice Wallenberg Foundation (KAW2022.0332, KAW2023.0393), 2023-05-01 -- 2028-04-01.

Subject Categories (SSIF 2025)

Software Engineering

DOI

10.1088/2632-2153/ae1d0b

Latest update

12/12/2025