Implicit differentiation of variational quantum algorithms
Preprint, 2022

Several quantities important in condensed matter physics, quantum information, and quantum chemistry, as well as quantities required in meta-optimization of machine learning algorithms, can be expressed as gradients of implicitly defined functions of the parameters characterizing the system. Here, we show how to leverage implicit differentiation for gradient computation through variational quantum algorithms and explore applications in condensed matter physics, quantum machine learning, and quantum information. A function defined implicitly as the solution of a quantum algorithm, e.g., a variationally obtained ground- or steady-state, can be automatically differentiated using implicit differentiation while being agnostic to how the solution is computed. We apply this notion to the evaluation of physical quantities in condensed matter physics, such as generalized susceptibilities, studied through a variational quantum algorithm. Moreover, we develop two additional applications of implicit differentiation -- hyperparameter optimization in a quantum machine learning algorithm, and the variational construction of entangled quantum states based on a gradient-based maximization of a geometric measure of entanglement. Our work ties together several types of gradient calculations that can be performed with variational quantum circuits in a general way, without relying on tedious analytic derivations or approximate finite-difference methods.
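
As a minimal sketch of the idea summarized above (with notation chosen here rather than taken from the paper): let E(θ, a) be a variational cost, e.g. the energy of a parametrized circuit, with circuit parameters θ and system parameters a, and let θ*(a) be the implicitly defined solution satisfying the stationarity condition ∂_θ E(θ*(a), a) = 0. Differentiating this condition with respect to a gives the implicit-function-theorem gradient, which does not depend on how θ*(a) was obtained:

\[
\partial_\theta^2 E \,\frac{d\theta^*}{da} + \partial_a \partial_\theta E = 0
\quad\Longrightarrow\quad
\frac{d\theta^*}{da} = -\left(\partial_\theta^2 E\right)^{-1} \partial_a \partial_\theta E ,
\]

so that any quantity A(θ*(a), a) evaluated at the variational solution, such as a generalized susceptibility, inherits the gradient

\[
\frac{dA}{da} = \partial_a A + \partial_\theta A \,\frac{d\theta^*}{da}.
\]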

Keywords

quantum computing, variational quantum algorithms, quantum information, implicit differentiation, quantum chemistry, automatic differentiation, condensed matter physics, machine learning

Authors

Shahnawaz Ahmed

Chalmers, Microtechnology and Nanoscience (MC2), Applied Quantum Physics

Nathan Killoran

Xanadu

Juan Carrasquilla

Swiss Federal Institute of Technology in Zürich (ETH)

Vector Institute for AI

University of Waterloo

Roots

Basic sciences

Subject Categories

Condensed Matter Physics

DOI

10.48550/arXiv.2211.13765

Latest update

7/2/2024