ArXiv TLDR

Gating Enables Curvature: A Geometric Expressivity Gap in Attention

arXiv:2604.14702

Satwik Bathula, Anand A. Joshi

cs.LG · stat.ML

TLDR

Multiplicative gating lets attention layers form curved (non-flat) representation geometries, opening an expressivity gap over ungated attention that shows up chiefly on tasks with non-linear structure.

Key contributions

  • Ungated attention is restricted to intrinsically flat statistical manifolds.
  • Multiplicative gating enables non-flat, positively curved geometric representations.
  • Establishes a geometric expressivity gap between gated and ungated attention.
  • Gated models show higher representation curvature and outperform on non-linear tasks, with no consistent advantage on linear ones.
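To make the gated-versus-ungated contrast concrete, here is a minimal NumPy sketch of one common variant of multiplicative output gating (an input-dependent sigmoid gate applied elementwise to the attention output). The weight names and the exact gating placement are illustrative assumptions, not the paper's specific formulation:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, Wq, Wk, Wv):
    """Standard (ungated) single-head attention: an affine map of the values."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return A @ V

def gated_attention(X, Wq, Wk, Wv, Wg):
    """Multiplicative gating (illustrative variant): elementwise sigmoid gate
    on the attention output. The input-dependent product is what breaks the
    affine structure of the ungated operator."""
    g = 1.0 / (1.0 + np.exp(-(X @ Wg)))   # gate values in (0, 1)
    return g * attention(X, Wq, Wk, Wv)

rng = np.random.default_rng(0)
n, d = 4, 8
X = rng.normal(size=(n, d))
Wq, Wk, Wv, Wg = (rng.normal(size=(d, d)) * 0.1 for _ in range(4))
Y = gated_attention(X, Wq, Wk, Wv, Wg)
print(Y.shape)  # (4, 8)
```

The key structural point is the elementwise product `g * attention(...)`: with `g` fixed the map is still affine in the values, but because `g` depends on the input, the composed operator is not.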

Why it matters

This paper provides a crucial geometric explanation for the empirical success of gated attention in LLMs. It reveals that gating allows attention to model complex, non-flat data manifolds, addressing a fundamental expressivity limitation of ungated attention. This understanding can guide future architectural designs.

Original Abstract

Multiplicative gating is widely used in neural architectures and has recently been applied to attention layers to improve performance and training stability in large language models. Despite the success of gated attention, the mathematical implications of gated attention mechanisms remain poorly understood. We study attention through the geometry of its representations by modeling outputs as mean parameters of Gaussian distributions and analyzing the induced Fisher--Rao geometry. We show that the ungated attention operator is restricted to intrinsically flat statistical manifolds due to its affine structure, while multiplicative gating enables non-flat geometries, including positively curved manifolds that are unattainable in the ungated setting. These results establish a geometric expressivity gap between ungated and gated attention. Empirically, we show that gated models exhibit higher representation curvature and improved performance on tasks requiring nonlinear decision boundaries, whereas they provide no consistent advantage on tasks with linear decision boundaries. Furthermore, we identify a structured regime in which curvature accumulates under composition, yielding a systematic depth amplification effect.
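As background for the flatness claim, here is the standard information-geometry fact it rests on (a gloss on the setup, not a derivation taken from the paper): for a Gaussian family parameterized by its mean with the covariance held fixed, the Fisher information metric is constant, so the Fisher–Rao geometry is flat.

```latex
% Fisher--Rao metric of the family N(\mu, \Sigma) with \Sigma fixed:
g_{ij}(\mu)
  = \mathbb{E}_{p_\mu}\!\left[\partial_{\mu_i}\log p_\mu(x)\,
                              \partial_{\mu_j}\log p_\mu(x)\right]
  = \left(\Sigma^{-1}\right)_{ij}.
% The metric does not depend on \mu, so the Riemann curvature tensor
% vanishes: the mean-parameter manifold is intrinsically flat
% (Euclidean up to a linear change of coordinates). An affine map of
% \mu, like ungated attention, cannot change this; an input-dependent
% multiplicative reparameterization can.
```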
