ArXiv TLDR

GlobalCY I: A JAX Framework for Globally Defined and Symmetry-Aware Neural Kähler Potentials

2604.11404

Abdul Rahman

hep-th · cs.LG · math.AG

TLDR

GlobalCY is a JAX framework for neural Kähler potentials on Calabi-Yau geometries; in its benchmarks, globally defined invariant models clearly outperform local-input baselines.

Key contributions

  • Introduces GlobalCY, a JAX framework for globally defined, symmetry-aware neural Kähler potentials on Calabi-Yau manifolds.
  • Compares local-input, globally defined invariant, and symmetry-aware neural Kähler potential models on the hard Cefalú cases $λ=0.75$ and $λ=1.0$.
  • Demonstrates that globally defined invariant models significantly outperform local baselines on the key geometric diagnostics: negative-eigenvalue frequency and projective-invariance drift.
  • Concludes global invariant structure is a meaningful architectural constraint for learned Kähler-potential modeling.
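GlobalCY's actual architecture is not reproduced here, but the core idea of a "globally defined invariant" model can be sketched: anchor the Kähler potential to the Fubini-Study potential and add a neural correction that depends only on features invariant under projective rescaling $Z \to cZ$, so the correction is a genuine function on projective space. All names below (`invariant_features`, `correction`, `kahler_potential`) are hypothetical, not GlobalCY's API.

```python
import jax
import jax.numpy as jnp

def invariant_features(Z):
    # |Z_i|^2 / sum_j |Z_j|^2 is unchanged under Z -> c Z,
    # so anything built from these features is projectively invariant.
    norms = jnp.abs(Z) ** 2
    return norms / jnp.sum(norms)

def init_mlp(key, in_dim=4, hidden=16):
    k1, k2 = jax.random.split(key)
    return {
        "W1": jax.random.normal(k1, (in_dim, hidden)) * 0.1,
        "b1": jnp.zeros(hidden),
        "W2": jax.random.normal(k2, (hidden, 1)) * 0.1,
    }

def correction(params, Z):
    # Small MLP acting only on scale-invariant features.
    h = jnp.tanh(invariant_features(Z) @ params["W1"] + params["b1"])
    return (h @ params["W2"])[0]

def kahler_potential(params, Z):
    # Fubini-Study reference term plus the invariant neural correction.
    return jnp.log(jnp.sum(jnp.abs(Z) ** 2)) + correction(params, Z)

key = jax.random.PRNGKey(0)
params = init_mlp(key)
k_re, k_im = jax.random.split(key)
Z = jax.random.normal(k_re, (4,)) + 1j * jax.random.normal(k_im, (4,))

# The correction is exactly invariant under rescaling Z -> c Z:
drift = correction(params, 2.7 * Z) - correction(params, Z)
print(abs(drift))  # ~0 up to float error
```

A symmetry-aware variant for the Cefalú family (which is symmetric under coordinate permutations) could additionally sort the features or use elementary symmetric polynomials, making the correction permutation-invariant by construction.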

Why it matters

This paper targets a subtle failure mode: local-input neural models can train successfully on hard Calabi-Yau geometries while still failing the geometry-sensitive diagnostics that matter. It shows that globally defined invariant models substantially improve those diagnostics when learning Kähler potentials, making global invariance a practical architectural constraint for numerical Calabi-Yau metrics in string theory applications.

Original Abstract

We present \emph{GlobalCY}, a JAX-based framework for globally defined and symmetry-aware neural Kähler-potential models on projective hypersurface Calabi--Yau geometries. The central problem is that local-input neural Kähler-potential models can train successfully while still failing the geometry-sensitive diagnostics that matter in hard quartic regimes, especially near singular and near-singular members of the Cefalú family. To study this, we compare three model families -- a local-input baseline, a globally defined invariant model, and a symmetry-aware global model -- on the hard Cefalú cases $λ=0.75$ and $λ=1.0$ using a fixed multi-seed protocol and a geometry-aware diagnostic suite. In this benchmark, the globally defined invariant model is the strongest overall family, outperforming the local baseline on the two clearest geometric comparison metrics, negative-eigenvalue frequency and projective-invariance drift, in both cases. The gains are strongest at $λ=0.75$, while $λ=1.0$ remains more difficult. The current symmetry-aware model improves projective-invariance drift relative to the local baseline, but does not yet surpass the plain global invariant model. These results show that global invariant structure is a meaningful architectural constraint for learned Kähler-potential modeling in hard quartic Calabi--Yau settings.
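The abstract's two headline diagnostics can be made concrete. The paper's exact definitions are not given here, but negative-eigenvalue frequency presumably counts sample points where the learned Kähler metric $g_{i\bar{j}} = \partial_{Z_i}\partial_{\bar{Z}_j}K$ fails to be positive definite. A minimal Wirtinger-calculus computation of that metric in JAX (the function names are illustrative, not from GlobalCY):

```python
import jax
import jax.numpy as jnp

def kahler_metric(K, Z):
    """Hermitian metric g_{i jbar} = d^2 K / dZ_i dZbar_j.

    Uses the Wirtinger identity g = (Hxx + Hyy + i(Hxy - Hyx)) / 4,
    where H is the real Hessian of K in coordinates (x, y), Z = x + iy.
    """
    n = Z.shape[0]

    def K_real(xy):
        z = xy[:n] + 1j * xy[n:]
        return K(z)

    xy = jnp.concatenate([Z.real, Z.imag])
    H = jax.hessian(K_real)(xy)
    Hxx, Hxy = H[:n, :n], H[:n, n:]
    Hyx, Hyy = H[n:, :n], H[n:, n:]
    return 0.25 * (Hxx + Hyy + 1j * (Hxy - Hyx))

def min_eigenvalue(K, Z):
    # A negative value here flags a positivity failure of the metric,
    # the kind of event a negative-eigenvalue frequency would count.
    g = kahler_metric(K, Z)
    return jnp.linalg.eigvalsh(g)[0]

# Sanity check: the flat potential K = |Z|^2 gives g = identity.
Z = jnp.array([1.0 + 2.0j, 0.5 - 1.0j, -0.3 + 0.1j])
flat = lambda z: jnp.sum(z.real**2 + z.imag**2)
print(min_eigenvalue(flat, Z))  # -> 1.0 (up to float error)
```

Projective-invariance drift can be read analogously: how much a quantity that should be a function on projective space changes under a rescaling $Z \to cZ$, evaluated over sampled points.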
