Parallel Scan Recurrent Neural Quantum States for Scalable Variational Monte Carlo
Ejaaz Merali, Mohamed Hibat-Allah, Mohammad Kohandel, Richard T. Scalettar, Ehsan Khatami
TLDR
This work introduces Parallel Scan Recurrent Neural Quantum States (PSR-NQS), demonstrating that recurrent neural networks can efficiently and accurately simulate large quantum many-body systems.
Key contributions
- Develops Parallel Scan Recurrent Neural Quantum States (PSR-NQS) for variational Monte Carlo.
- Leverages parallelizable recurrence (parallel scans) to remove the sequential bottleneck in training recurrent neural networks.
- Achieves accurate benchmark results in one and two spatial dimensions.
- Scales, with iterative retraining, to 2D spin lattices as large as 52×52 while remaining in agreement with quantum Monte Carlo data.
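The parallel-scan idea behind these contributions can be sketched in a few lines. A linear RNN-style recurrence h_t = a_t · h_{t-1} + b_t looks inherently sequential, but each step is an affine map, and composing affine maps is associative, so the whole sequence can be evaluated with an associative scan in O(log T) parallel depth instead of O(T) sequential steps. The toy implementation below (a hypothetical illustration, not the authors' PSR-NQS code; `combine`, `parallel_scan`, and `run_recurrence` are names chosen here) shows a Hillis-Steele-style scan on scalar states; real architectures apply the same trick to vector or matrix-valued recurrences on accelerators.

```python
# Toy illustration of parallelizable recurrence via an associative scan.
# NOT the paper's implementation: a minimal sketch of the general technique.
# Recurrence: h_t = a_t * h_{t-1} + b_t, represented as the pair (a_t, b_t).

def combine(x, y):
    """Compose two affine steps (apply x first, then y).

    (a1, b1) then (a2, b2) gives h -> a2*(a1*h + b1) + b2.
    This operation is associative, which is what makes a scan possible.
    """
    a1, b1 = x
    a2, b2 = y
    return (a2 * a1, a2 * b1 + b2)

def parallel_scan(steps):
    """Inclusive associative scan, Hillis-Steele style.

    Each while-iteration could run its list comprehension in parallel,
    giving O(log T) depth with enough workers (vs. O(T) sequentially).
    """
    result = list(steps)
    shift = 1
    while shift < len(result):
        result = [
            combine(result[i - shift], result[i]) if i >= shift else result[i]
            for i in range(len(result))
        ]
        shift *= 2
    return result

def run_recurrence(a, b, h0=0.0):
    """Evaluate h_t = a_t*h_{t-1} + b_t for all t via the parallel scan."""
    scanned = parallel_scan(list(zip(a, b)))
    # Each scanned element (A_t, B_t) is the composition of steps 1..t,
    # so h_t = A_t * h0 + B_t.
    return [A * h0 + B for A, B in scanned]
```

For example, `run_recurrence([0.5, 2.0, 1.0], [1.0, -1.0, 3.0], h0=1.0)` reproduces the sequential loop h1 = 1.5, h2 = 2.0, h3 = 5.0, while the scan itself needs only logarithmically many dependent stages.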
Why it matters
This work challenges the perception that recurrent neural networks are inherently too sequential to scale for quantum simulations. By demonstrating efficient training and high accuracy on large systems, it opens new avenues for modeling complex quantum many-body problems with modest computational resources.
Original Abstract
Neural-network quantum states have emerged as a powerful variational framework for quantum many-body systems, with recent progress often driven by massively parallel architectures such as transformers. Recurrent neural network quantum states, however, are frequently regarded as intrinsically sequential and therefore less scalable. Here we revisit this view by showing that modern recurrent architectures can support fast, accurate, and computationally accessible neural quantum state simulations. Using autoregressive recurrent wave functions together with recent advances in parallelizable recurrence, we develop variational ansätze, called parallel scan recurrent neural quantum states (PSR-NQS), which can be trained efficiently within variational Monte Carlo in one and two spatial dimensions. We demonstrate accurate benchmark results and show that, with iterative retraining, our approach reaches two-dimensional spin lattices as large as $52\times52$ while remaining in agreement with available quantum Monte Carlo data. Our results establish recurrent architectures as a practical and promising route toward scalable neural quantum state simulations with modest computational resources.