ArXiv TLDR

Field Theory of Data: Anomaly Detection via the Functional Renormalization Group. The 2D Ising Model as a Benchmark

2605.11138

Riccardo Finotello, Vincent Lahoche, Parham Radpay, Dine Ousmane Samary

cond-mat.stat-mech · cs.IT · hep-th · stat.ME

TLDR

This paper proposes a field-theoretic approach to anomaly detection in high-noise data using the Functional Renormalization Group, identifying critical thresholds more accurately than standard information-theoretic metrics such as the Kullback-Leibler divergence.

Key contributions

  • Links high-noise anomaly detection to non-equilibrium field theory's renormalization group flow.
  • Maps phase transition detection to an effective equilibrium field theory near a Gaussian fixed point.
  • Applies the Functional Renormalization Group to the 2D Model A, identifying the noise-to-signal ratio as a physical temperature.
  • Identifies critical thresholds with <4% error, outperforming standard information-theoretic metrics.
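The Gaussian fixed point the paper identifies with the Marchenko-Pastur distribution has a familiar concrete counterpart in random-matrix theory: eigenvalues of a pure-noise sample covariance matrix stay inside the Marchenko-Pastur bulk, so eigenvalues escaping its upper edge signal structure on top of noise. The sketch below illustrates that generic baseline (not the authors' FRG flow); the planted rank-1 signal, its strength, and the matrix sizes are illustrative choices.

```python
import numpy as np

# For an n x p pure-noise matrix X with i.i.d. unit-variance entries, the
# eigenvalues of the sample covariance X^T X / n concentrate (as n, p grow)
# in the Marchenko-Pastur bulk, whose upper edge is (1 + sqrt(p/n))^2.
# Eigenvalues beyond that edge indicate signal, i.e. an "anomaly".
rng = np.random.default_rng(0)
n, p = 2000, 400                      # samples x features (illustrative sizes)
q = p / n                             # aspect ratio
lam_plus = (1 + np.sqrt(q)) ** 2      # upper Marchenko-Pastur edge

# Pure noise: all covariance eigenvalues stay near or below lam_plus.
X = rng.standard_normal((n, p))
noise_eigs = np.linalg.eigvalsh(X.T @ X / n)

# Noise plus a planted rank-1 signal: one eigenvalue escapes the bulk.
u = rng.standard_normal(p)
u /= np.linalg.norm(u)
Y = X + 3.0 * rng.standard_normal((n, 1)) * u   # signal strength 3 (hypothetical)
signal_eigs = np.linalg.eigvalsh(Y.T @ Y / n)

print(f"MP upper edge     : {lam_plus:.3f}")
print(f"max noise eigval  : {noise_eigs.max():.3f}")
print(f"max signal eigval : {signal_eigs.max():.3f}")
```

In this picture the paper's contribution is to treat the departure from the Gaussian (Marchenko-Pastur) fixed point dynamically, via the renormalization group flow, rather than through a single spectral threshold as above.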

Why it matters

This work offers a novel, physically grounded framework for anomaly detection in noisy data. It provides a universal strategy for resolving structures in complex datasets near criticality, bridging statistical mechanics and statistical inference.

Original Abstract

We establish a correspondence between anomaly detection in high-noise regimes and the renormalization group flow of non-equilibrium field theories. We provide a physical grounding for this framework by proving that the detection of phase transitions in interacting non-equilibrium systems maps to the study of an effective equilibrium field theory near its Gaussian fixed point, which we identify with the universal Marchenko-Pastur distribution. Applying the Functional Renormalization Group to the two-dimensional Model A, we demonstrate that the noise-to-signal ratio acts as a physical temperature, where the signal emerges as ordered domains within a thermalized background of fluctuations. Using the exact Onsager solution as a benchmark, we show that this approach identifies critical thresholds with an error below 4%, significantly outperforming standard information-theoretic metrics such as the Kullback-Leibler divergence. Our results provide a universal strategy for resolving structures in complex datasets near criticality, bridging the gap between statistical mechanics and statistical inference.
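The Onsager benchmark mentioned in the abstract is the exact critical temperature of the 2D square-lattice Ising model, fixed by the condition sinh(2J/k_B T_c) = 1. A short sketch makes the reported "<4% error" concrete by computing T_c and the corresponding tolerance band (the band is our illustration of the error bound, not a quantity from the paper):

```python
import math

# Exact Onsager critical temperature of the 2D square-lattice Ising model,
# in units of J / k_B:  sinh(2 / T_c) = 1  =>  T_c = 2 / ln(1 + sqrt(2))
T_c = 2.0 / math.log(1.0 + math.sqrt(2.0))
print(f"Onsager T_c ~ {T_c:.5f}")   # ~ 2.26919

# A detected threshold with "<4% error" must fall inside this band
# (illustrative reading of the paper's reported accuracy):
lo, hi = 0.96 * T_c, 1.04 * T_c
print(f"4% tolerance band: [{lo:.4f}, {hi:.4f}]")
```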
