Speaker
Description
In recent years, neural-network quantum states (NQS) have emerged as powerful and reliable variational ansätze for studying strongly correlated quantum many-body systems. Architectures such as restricted Boltzmann machines, recurrent neural networks, and convolutional neural networks have since been employed as wave-function ansätze in variational Monte Carlo approaches to finding ground states of given Hamiltonians. More recently, the transformer (self-attention) neural-network architecture has demonstrated the capacity to capture long-range correlations. However, the computational cost of the transformer scales quadratically with the physical system size, which, together with high memory requirements, limits transformer-based approaches to small quantum systems. In this work, we investigate the multilayer-perceptron Mixer (MLP-Mixer) as an efficient alternative ansatz for neural quantum states. We retrofit the Mixer into an autoregressive ansatz, allowing us to sample qubit configurations directly from the squared wavefunction. To benchmark the proposed architecture, we compare its performance on two-dimensional systems of different sizes in various phases of matter, realised using common Hamiltonians. We show that the autoregressive Mixer finds comparable ground-state energies while exhibiting favourable scalability, suggesting it as a computationally efficient alternative for large-scale many-body variational simulations.
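The direct-sampling property mentioned above can be sketched in a few lines: an autoregressive ansatz factorises the Born probability $|\psi(s)|^2$ into a product of conditionals $p(s_i \mid s_1,\dots,s_{i-1})$, so configurations are drawn site by site without a Markov chain. The toy "network" below (random logistic weights) is a hypothetical stand-in for the Mixer-based model in the talk; it only illustrates the sampling scheme, not the actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4  # number of qubits (toy size)

# Stand-in network: random weights producing a conditional Bernoulli
# probability p(s_i = 1 | s_1..s_{i-1}) for each site i. In a real NQS
# this head would come from the (Mixer-based) network.
W = rng.normal(size=(N, N))
b = rng.normal(size=N)

def cond_prob_one(i, prefix):
    """Probability that qubit i is 1, given the already-sampled prefix."""
    x = np.zeros(N)
    x[:i] = prefix                      # causal mask: only earlier sites visible
    z = W[i] @ x + b[i]
    return 1.0 / (1.0 + np.exp(-z))     # sigmoid

def sample_configuration():
    """Draw s ~ |psi(s)|^2 site by site, accumulating its exact probability."""
    s = np.zeros(N, dtype=int)
    prob = 1.0
    for i in range(N):
        p1 = cond_prob_one(i, s[:i])
        s[i] = int(rng.random() < p1)
        prob *= p1 if s[i] == 1 else (1.0 - p1)
    return s, prob

def total_prob():
    """Sum the model probability over all 2^N configurations."""
    total = 0.0
    for idx in range(2 ** N):
        s = np.array([(idx >> k) & 1 for k in range(N)])
        p = 1.0
        for i in range(N):
            p1 = cond_prob_one(i, s[:i])
            p *= p1 if s[i] == 1 else (1.0 - p1)
        total += p
    return total

# Because |psi|^2 factorises into normalised conditionals, the probabilities
# of all configurations sum to one by construction -- no MCMC equilibration.
print(round(total_prob(), 10))  # 1.0
```

Exact normalisation by construction is what lets autoregressive NQS replace Metropolis sampling with independent, unbiased draws.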
| Keyword-1 | Neural Quantum State |
|---|---|
| Keyword-2 | Many-body Physics |
| Keyword-3 | Artificial Neural Network |