Thrilled to share what I’ve been working on for the last two years - a new way to solve one of the most fundamental problems in quantum physics, computing excited states! https://arxiv.org/abs/2308.16848
Natural Quantum Monte Carlo Computation of Excited States

We present a variational Monte Carlo algorithm for estimating the lowest excited states of a quantum system which is a natural generalization of the estimation of ground states. The method has no free parameters and requires no explicit orthogonalization of the different states, instead transforming the problem of finding excited states of a given system into that of finding the ground state of an expanded system. Expected values of arbitrary observables can be calculated, including off-diagonal expectations between different states such as the transition dipole moment. Although the method is entirely general, it works particularly well in conjunction with recent work on using neural networks as variational Ansatze for many-electron systems, and we show that by combining this method with the FermiNet and Psiformer Ansatze we can accurately recover vertical excitation energies and oscillator strengths on molecules as large as benzene. Beyond the examples on molecules presented here, we expect this technique will be of great interest for applications of variational quantum Monte Carlo to atomic, nuclear and condensed matter physics.


When a quantum system is stimulated with energy, particles get kicked into higher energy states - going from the “ground state” to an “excited state” - and then fall back down to lower states, sometimes releasing light in the process.

(Source image: Quantum Physics for Babies)

This happens in lasers, semiconductors, LEDs, solar panels, fluorescent dyes, and all sorts of chemistry involving light, including the proteins in your eye responsible for vision. Lots of the exotic stuff in quantum mechanics can only be understood through excited states.

How can we calculate the properties of excited states? Deep learning has been used for very accurate quantum calculations, but the basic method behind these calculations, variational Monte Carlo (VMC), is almost 60 years old.

VMC works well for ground states, but despite decades of work, no one has found the right way to extend it to excited states. Working only with ground states is like trying to understand how a marble moves in a well when all you know is where the bottom of the well is.
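To make "VMC for ground states" concrete, here's a minimal toy sketch (not the paper's method, and nothing like the neural-network ansatzes it actually uses): Metropolis sampling of a one-parameter trial wavefunction for the 1D harmonic oscillator, averaging the local energy. The function name and parameter choices are my own illustration.

```python
import numpy as np

def vmc_energy(alpha, n_samples=200_000, seed=0):
    """Estimate <E> for the trial wavefunction psi(x) = exp(-alpha x^2)
    on the 1D harmonic oscillator (hbar = m = omega = 1), by Metropolis
    sampling of |psi|^2 and averaging the local energy."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        x_new = x + rng.normal()
        # accept with probability min(1, |psi(x_new)|^2 / |psi(x)|^2)
        if rng.random() < np.exp(-2 * alpha * (x_new**2 - x**2)):
            x = x_new
        samples.append(x)
    xs = np.array(samples[n_samples // 10:])  # drop burn-in
    # local energy (H psi)/psi = alpha + x^2 (1/2 - 2 alpha^2)
    e_loc = alpha + xs**2 * (0.5 - 2 * alpha**2)
    return e_loc.mean()
```

At alpha = 0.5 the trial function is the exact ground state, the local energy is constant, and the estimate is exactly 0.5; any other alpha gives a higher energy, which is the variational principle at work. This picture only ever finds "the bottom of the well" — the whole difficulty the thread describes is getting at the states above it.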

https://www.nature.com/articles/s41570-023-00516-8

Ab initio quantum chemistry with neural-network wavefunctions - Nature Reviews Chemistry

Quantum Monte Carlo methods using neural-network ansatzes can provide virtually exact solutions to the electronic Schrödinger equation for small systems and are comparable to conventional quantum chemistry methods when investigating systems with dozens of electrons.


We’ve figured out a new way to compute excited states with VMC which has none of the drawbacks of earlier methods. It has no free parameters, allows unbiased estimation of gradients and energies, and does not require different states to be explicitly orthogonal.

We do this by transforming the problem of finding many excited states of a system into the problem of finding the ground state of a generalized system. Then you can just use all the normal VMC machinery for computing ground states!
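Schematically, the construction works like this (my paraphrase of the paper; the symbols here are illustrative, not the authors' exact notation). To find the lowest K states, take K single-state ansatzes and K copies of the particle coordinates, and combine them into a determinant:

```latex
\Psi(x^1,\dots,x^K) \;\propto\; \det\!\big[\psi_i(x^j)\big]_{i,j=1}^{K},
\qquad
\hat{H}_{\mathrm{tot}} \;=\; \sum_{k=1}^{K} \hat{H}(x^k).
```

Minimizing the energy of this expanded system drives the $\psi_i$ to span the lowest K eigenstates, with total energy $\sum_{k=1}^{K} E_k$, and the determinant keeps the states from collapsing onto each other without any explicit orthogonalization step.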

We run our method through a battery of tests, and find that by combining deep learning methods with our new, natural method for computing excited states, we are able to accurately compute all sorts of properties of excited states for molecules as big as benzene. It works!
Stepping back a bit, I’ve found it very gratifying to work on - and make a new contribution to - a decades-old open problem in computational physics, at a time when it feels like everyone else in AI is rushing to work on whatever the coolest thing at the moment is. Excited states of quantum systems might be more niche than LLMs, but I really believe that in 10 or 20 years people will still be using our method. Whatever is big in AI today will almost surely be forgotten or supplanted by then.
So massive thanks to Simon Axelrod, Halvard Sutterud, Ingrid von Glehn and James Spencer (are any of them on this app?) as well as everyone who put up with all my naive questions! Let’s see what we can do with this!
@pfau honestly I didn't know deepmind was also working on these kinds of questions. Looks really interesting!