ArXiv TLDR

Duality for the Adversarial Total Variation

arXiv:2604.18540

Leon Bungert, Lucas Schmitt

math.AP · cs.LG · math.FA · math.OC

TLDR

This paper establishes a dual representation of the nonlocal total variation that arises in adversarial training of binary classifiers and uses it to characterize the subdifferential of this functional.
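
For orientation, here is a hedged sketch of the reformulation behind this result. The notation is illustrative rather than the paper's own: u denotes a binary classifier, ε > 0 the adversarial budget, μ the data distribution with class-conditional marginals ρ_0 and ρ_1, and B_ε(x) the ε-ball around x. Adversarial training is rewritten as regularized risk minimization,

\[
  \min_{u}\; \mathbb{E}_{(x,y)\sim\mu}\big[\,|u(x)-y|\,\big] \;+\; \varepsilon\,\mathrm{TV}_\varepsilon(u),
\]

where a nonlocal total variation, schematically of the following form (the exact definition studied in the paper may differ), measures how much an ε-perturbation can flip the classifier near each class:

\[
  \mathrm{TV}_\varepsilon(u) \;=\; \frac{1}{\varepsilon}\int \Big(\sup_{B_\varepsilon(x)} u - u(x)\Big)\,\mathrm{d}\rho_0(x)
  \;+\; \frac{1}{\varepsilon}\int \Big(u(x) - \inf_{B_\varepsilon(x)} u\Big)\,\mathrm{d}\rho_1(x).
\]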

Key contributions

  • Characterizes the subdifferential of the nonlocal total variation arising in adversarial training.
  • Establishes a dual representation of the nonlocal total variation.
  • Derives an integration by parts formula involving a nonlocal gradient and divergence (sketched below).
  • Provides duality statements both for continuous functions vanishing at infinity on proper metric spaces and for essentially bounded functions on Euclidean domains.
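
The duality statements follow the classical template in which the total variation is a support function and its subdifferential consists of the test fields attaining the supremum; a schematic version is sketched below. All symbols here are placeholders (a nonlocal divergence div_ε, a nonlocal gradient G_ε, reference measures μ and ν, and an admissible class of fields φ); the paper makes these precise on continuous functions vanishing at infinity on proper metric spaces and on essentially bounded functions on Euclidean domains.

\[
  \mathrm{TV}_\varepsilon(u) \;=\; \sup\Big\{ \int u\,\mathrm{div}_\varepsilon\varphi\,\mathrm{d}\mu \;:\; \varphi \text{ admissible},\ \|\varphi\|_\infty \le 1 \Big\},
\]
\[
  \int G_\varepsilon u\cdot\varphi\,\mathrm{d}\nu \;=\; -\int u\,\mathrm{div}_\varepsilon\varphi\,\mathrm{d}\mu
  \qquad \text{(nonlocal integration by parts)},
\]
\[
  z \in \partial\mathrm{TV}_\varepsilon(u) \;\Longleftrightarrow\; z = \mathrm{div}_\varepsilon\varphi \ \text{for some admissible } \|\varphi\|_\infty \le 1 \ \text{with } \int u\,z\,\mathrm{d}\mu = \mathrm{TV}_\varepsilon(u),
\]

up to taking closures of the admissible set and under the additional conditions stated in the paper.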

Why it matters

This work strengthens the mathematical foundation of adversarial training through a detailed analysis of the nonlocal total variation that arises in it. Characterizing its dual representation and subdifferential can inform the design of more robust and theoretically grounded binary classifiers.

Original Abstract

Adversarial training of binary classifiers can be reformulated as regularized risk minimization involving a nonlocal total variation. Building on this perspective, we establish a characterization of the subdifferential of this total variation using duality techniques. To achieve this, we derive a dual representation of the nonlocal total variation and a related integration by parts formula, involving a nonlocal gradient and divergence. We provide such duality statements both in the space of continuous functions vanishing at infinity on proper metric spaces and for the space of essentially bounded functions on Euclidean domains. Furthermore, under some additional conditions we provide characterizations of the subdifferential in these settings.
