Implicit Transfer Operator Learning: Multiple Time-Resolution Models for Molecular Dynamics
Paper in proceedings, 2023

Computing properties of molecular systems relies on estimating expectations of the (unnormalized) Boltzmann distribution. Molecular dynamics (MD) is a broadly adopted technique to approximate such quantities. However, stable simulations rely on very small integration time-steps (10^{-15}s), whereas convergence of some moments, e.g. binding free energy or rates, might rely on sampling processes on time-scales as long as 10^{-1}s, and these simulations must be repeated for every molecular system independently. Here, we present Implicit Transfer Operator (ITO) Learning, a framework to learn surrogates of the simulation process with multiple time-resolutions. We implement ITO with denoising diffusion probabilistic models with a new SE(3)-equivariant architecture and show the resulting models can generate self-consistent stochastic dynamics across multiple time-scales, even when the system is only partially observed. Finally, we present a coarse-grained CG-SE3-ITO model which can quantitatively model all-atom molecular dynamics using only coarse molecular representations. As such, ITO provides an important step towards multiple time- and space-resolution acceleration of MD. Code is available at https://github.com/olsson-group/ito.
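The core idea, learning a generative surrogate that samples a lagged future state conditioned on the current state and an adjustable time lag, can be sketched with a conditional denoising diffusion model. The snippet below is a minimal illustration only, not code from the linked repository: it substitutes a toy MLP for the paper's SE(3)-equivariant architecture, a synthetic Ornstein-Uhlenbeck trajectory for MD data, and uses hypothetical names (ScoreNet, toy_trajectory) chosen for this sketch.

# Minimal sketch: a denoising diffusion model that, conditioned on a state
# x_t and a time lag N, learns to sample x_{t+N}. All names and the data
# are illustrative stand-ins, not the ito repository's implementation.
import torch
import torch.nn as nn

T = 100                                  # number of diffusion steps
betas = torch.linspace(1e-4, 0.02, T)    # standard DDPM noise schedule
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

class ScoreNet(nn.Module):
    """Predicts the noise added to x_{t+lag}, conditioned on x_t and lag."""
    def __init__(self, dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim + 2, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x_noisy, x_cond, lag, k):
        # k: diffusion step index, lag: physical time lag (both scaled to ~[0, 1])
        h = torch.cat([x_noisy, x_cond, lag[:, None], k[:, None]], dim=-1)
        return self.net(h)

def toy_trajectory(n=10000, dim=2):
    """Synthetic Ornstein-Uhlenbeck trajectory as a stand-in for MD data."""
    x = torch.zeros(n, dim)
    for i in range(1, n):
        x[i] = 0.99 * x[i - 1] + 0.1 * torch.randn(dim)
    return x

traj = toy_trajectory()
model = ScoreNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    # Sample conditioning frames, random lags, and the lagged target frames.
    lag = torch.randint(1, 100, (256,))
    idx = torch.randint(0, len(traj) - 100, (256,))
    x_cond, x_target = traj[idx], traj[idx + lag]

    # Standard DDPM forward-noising of the target, followed by noise prediction.
    k = torch.randint(0, T, (256,))
    eps = torch.randn_like(x_target)
    ab = alpha_bars[k][:, None]
    x_noisy = ab.sqrt() * x_target + (1 - ab).sqrt() * eps
    pred = model(x_noisy, x_cond, lag.float() / 100.0, k.float() / T)

    loss = ((pred - eps) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

Training over randomly drawn lags is what gives the surrogate multiple time-resolutions: a single model can then roll out dynamics with either small or large effective time-steps by choosing the lag at sampling time.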

Authors

Jacob Mathias Schreiner

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Ole Winther

Technical University of Denmark (DTU)

Simon Olsson

Chalmers, Computer Science and Engineering (Chalmers), Data Science and AI

Advances in Neural Information Processing Systems

ISSN: 1049-5258

Vol. 36

37th Conference on Neural Information Processing Systems (NeurIPS 2023), New Orleans, USA

Infrastructure

C3SE (Chalmers Centre for Computational Science and Engineering)

Subject Categories

Bioinformatics (Computational Biology)

Latest update

9/26/2024