Graph Neural Networks in the Wilson Loop Representation of Abelian Lattice Gauge Theories
TLDR
This paper introduces a gauge-invariant Graph Neural Network (GNN) architecture for Abelian lattice gauge theories, accurately predicting observables and simulating dynamics.
Key contributions
- Introduces a novel gauge-invariant GNN architecture for Abelian lattice gauge models.
- Uses Wilson loops as inputs to explicitly enforce symmetry and eliminate redundant gauge degrees of freedom.
- Achieves accurate predictions for global and spatially resolved quantities in Z2 and U(1) lattice gauge models.
- Acts as an efficient surrogate for semiclassical dynamics in U(1) quantum link models, enabling scalable time evolution.
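The core idea in the contributions above, feeding only gauge-invariant quantities (Wilson loops) into a message-passing network, can be illustrated with a toy sketch. The paper's actual architecture is not public here, so everything below is an assumption: a Z2 gauge field on a small periodic square lattice, plaquette Wilson loops as node features, and a single hand-weighted message-passing step over neighboring plaquettes. The point it demonstrates is that because the inputs are gauge-invariant, the network output is unchanged under any local gauge transformation.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 4  # linear lattice size (toy example)

# Z2 link variables: links[0] = horizontal links h[x, y], links[1] = vertical v[x, y]
links = rng.choice([-1, 1], size=(2, L, L))

def wilson_plaquettes(links):
    """Smallest Wilson loops: product of the 4 links around each plaquette."""
    h, v = links
    # plaquette at (x, y): h(x,y) * v(x+1,y) * h(x,y+1) * v(x,y); roll(-1) shifts to x+1 / y+1
    return h * np.roll(v, -1, axis=0) * np.roll(h, -1, axis=1) * v

def gauge_transform(links, g):
    """Local Z2 gauge transformation with site factors g[x, y] = +-1."""
    h, v = links
    return np.stack([g * h * np.roll(g, -1, axis=0),   # h(x,y) -> g(x,y) h(x,y) g(x+1,y)
                     g * v * np.roll(g, -1, axis=1)])  # v(x,y) -> g(x,y) v(x,y) g(x,y+1)

def message_pass(f, w_self=0.5, w_nbr=0.25):
    """One toy message-passing step: mix each plaquette feature with its 4 neighbors."""
    nbrs = sum(np.roll(f, s, axis=ax) for s in (1, -1) for ax in (0, 1))
    return np.tanh(w_self * f + w_nbr * nbrs)

# Gauge invariance check: every site factor enters each plaquette twice, and g**2 = 1,
# so the Wilson loops (and hence the network output) are unchanged.
g = rng.choice([-1, 1], size=(L, L))
W = wilson_plaquettes(links)
W_transformed = wilson_plaquettes(gauge_transform(links, g))
out = message_pass(W.astype(float))
out_transformed = message_pass(W_transformed.astype(float))
```

A trained model would replace the fixed weights with learned ones and stack several such layers, but the invariance argument is the same: message passing only ever combines quantities that are already gauge-invariant.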
Why it matters
By building gauge symmetry directly into the network architecture, the model never has to learn redundant gauge degrees of freedom, which makes training more data-efficient and inference more scalable. This matters because Abelian lattice gauge theories underpin descriptions of strongly correlated phases in condensed matter and of engineered dynamics on synthetic quantum platforms, where conventional simulation methods become expensive at scale.
Original Abstract
Local gauge structures play a central role in a wide range of condensed matter systems and synthetic quantum platforms, where they emerge as effective descriptions of strongly correlated phases and engineered dynamics. We introduce a gauge-invariant graph neural network (GNN) architecture for Abelian lattice gauge models, in which symmetry is enforced explicitly through local gauge-invariant inputs, such as Wilson loops, and preserved throughout message passing, eliminating redundant gauge degrees of freedom while retaining expressive power. We benchmark the approach on both $\mathbb{Z}_2$ and $\mathrm{U}(1)$ lattice gauge models, achieving accurate predictions of global observables and spatially resolved quantities despite the nonlocal correlations induced by gauge-matter coupling. We further demonstrate that the learned model serves as an efficient surrogate for semiclassical dynamics in $\mathrm{U}(1)$ quantum link models, enabling stable and scalable time evolution without repeated fermionic diagonalization, while faithfully reproducing both local dynamics and statistical correlations. These results establish gauge-invariant message passing as a compact and physically grounded framework for learning and simulating Abelian lattice gauge systems.
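The abstract's surrogate-dynamics claim, replacing repeated fermionic diagonalization with a learned model inside a semiclassical time-evolution loop, can be caricatured as follows. This is purely illustrative: the "exact" force here is a stand-in for the costly diagonalization step, and the "surrogate" is a stand-in for the trained gauge-invariant GNN (here just a small-angle polynomial approximation), integrated with a standard leapfrog scheme rather than the paper's actual equations of motion.

```python
import numpy as np

def expensive_force(phi):
    # stand-in for the costly per-step computation (e.g. fermionic diagonalization)
    return -np.sin(phi)

def surrogate_force(phi):
    # stand-in for a trained surrogate model; a cheap small-angle approximation of the above
    return -(phi - phi**3 / 6)

def evolve(phi0, pi0, force, dt=0.05, steps=200):
    """Leapfrog integration of toy semiclassical equations of motion."""
    phi, pi = phi0.copy(), pi0.copy()
    for _ in range(steps):
        pi += 0.5 * dt * force(phi)   # half-step momentum kick
        phi += dt * pi                # full-step field update
        pi += 0.5 * dt * force(phi)   # half-step momentum kick
    return phi, pi

rng = np.random.default_rng(1)
phi0 = 0.1 * rng.standard_normal(16)  # small initial field fluctuations
pi0 = np.zeros(16)

phi_exact, _ = evolve(phi0, pi0, expensive_force)
phi_surr, _ = evolve(phi0, pi0, surrogate_force)
max_err = np.max(np.abs(phi_exact - phi_surr))
```

The payoff in the paper's setting is that each surrogate call is a single network forward pass, so long trajectories stay cheap while, per the abstract, local dynamics and statistical correlations are faithfully reproduced.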