Implicit differentiation of variational quantum algorithms
Preprint, 2022
Several quantities important in condensed matter physics, quantum information, and quantum chemistry, as well as quantities required in meta-optimization of machine learning algorithms, can be expressed as gradients of implicitly defined functions of the parameters characterizing the system. Here, we show how to leverage implicit differentiation for gradient computation through variational quantum algorithms and explore applications in condensed matter physics, quantum machine learning, and quantum information. A function defined implicitly as the solution of a quantum algorithm, e.g., a variationally obtained ground or steady state, can be automatically differentiated using implicit differentiation while remaining agnostic to how the solution is computed. We apply this notion to the evaluation of physical quantities in condensed matter physics, such as generalized susceptibilities, studied through a variational quantum algorithm. Moreover, we develop two additional applications of implicit differentiation: hyperparameter optimization in a quantum machine learning algorithm, and the variational construction of entangled quantum states based on a gradient-based maximization of a geometric measure of entanglement. Our work ties together several types of gradient calculations that can be performed with variational quantum circuits in a general way, without relying on tedious analytic derivations or approximate finite-difference methods.
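To make the mechanism concrete, the sketch below (our illustration, not code from the paper) applies the implicit function theorem to a toy, purely classical energy standing in for a variational quantum cost: if the optimal parameters theta*(a) satisfy dE/dtheta = 0, then d theta*/da = -(d^2E/dtheta^2)^{-1} d^2E/(dtheta da), regardless of which optimizer produced theta*. The names energy, solve, and implicit_grad, and the specific toy cost function, are placeholders introduced here for illustration only.

```python
# Minimal sketch (not from the paper): implicit differentiation of the minimizer
# of a toy "energy" E(theta, a), standing in for a variationally optimized
# quantum circuit whose cost depends on Hamiltonian parameters a.
import jax
import jax.numpy as jnp

def energy(theta, a):
    # Toy classical stand-in for a variational energy <psi(theta)|H(a)|psi(theta)>.
    return jnp.sum((theta - jnp.sin(a)) ** 2) + 0.1 * jnp.sum(theta ** 4)

def solve(theta0, a, lr=0.1, steps=500):
    # Any optimizer may be used; implicit differentiation only needs the
    # stationarity condition dE/dtheta = 0 to hold at the returned solution.
    g = jax.grad(energy, argnums=0)
    theta = theta0
    for _ in range(steps):
        theta = theta - lr * g(theta, a)
    return theta

def implicit_grad(theta_star, a):
    # Implicit-function-theorem gradient of the solution with respect to a:
    # d theta*/d a = -(d^2E/dtheta^2)^{-1} d^2E/(dtheta da), evaluated at theta*.
    hess = jax.hessian(energy, argnums=0)(theta_star, a)
    mixed = jax.jacfwd(jax.grad(energy, argnums=0), argnums=1)(theta_star, a)
    return -jnp.linalg.solve(hess, mixed)

a = jnp.array([0.3, 0.7])
theta_star = solve(jnp.zeros(2), a)
print(implicit_grad(theta_star, a))  # sensitivity of the optimized parameters to a
```

Because only the stationarity condition enters, the same gradient formula applies whichever routine produced theta*, which is the sense in which implicit differentiation is agnostic to how the solution is computed; gradients of observables evaluated at theta*(a) then follow by the chain rule.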
Keywords: quantum computing, variational quantum algorithms, quantum information, implicit differentiation, quantum chemistry, automatic differentiation, condensed matter physics, machine learning