ArXiv TLDR

Min-Max Optimization Requires Exponentially Many Queries

arXiv: 2605.13806

Martino Bernasconi, Matteo Castiglioni, Andrea Celli, Alexandros Hollender

cs.DS cs.CC cs.GT cs.LG math.OC

TLDR

Min-max optimization for nonconvex-nonconcave functions demands exponentially many queries to find an approximate stationary point.

Key contributions

  • Studies the query complexity of min-max optimization for nonconvex-nonconcave functions over $[0,1]^d \times [0,1]^d$.
  • Shows that finding an $\varepsilon$-approximate stationary point is hard in the black-box model, even with oracle access to both $f$ and its gradient $\nabla f$.
  • Proves that any such algorithm must make a number of queries exponential in $1/\varepsilon$ or in the dimension $d$.
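One common way to make "$\varepsilon$-approximate stationary point" precise over a box constraint is via the projected-gradient residual; this is a standard convention for constrained problems, shown here as an illustration (the paper's exact definition and normalization may differ):

```latex
% (x, y) is eps-stationary if one projected descent-ascent step of size
% eta barely moves it:
\left\| (x, y) \;-\; \Pi_{[0,1]^d \times [0,1]^d}\!\Big( (x, y) - \eta \big( \nabla_x f(x, y),\, -\nabla_y f(x, y) \big) \Big) \right\| \;\le\; \varepsilon \eta
```

Here $\Pi$ denotes Euclidean projection onto the box, and the minus sign on $\nabla_y f$ reflects that $y$ ascends while $x$ descends.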

Why it matters

This research reveals a fundamental computational bottleneck in min-max optimization for nonconvex-nonconcave settings. It implies that current algorithms may face inherent scalability challenges, guiding future research towards more efficient approaches or problem reformulations.

Original Abstract

We study the query complexity of min-max optimization of a nonconvex-nonconcave function $f$ over $[0,1]^d \times [0,1]^d$. We show that, given oracle access to $f$ and to its gradient $\nabla f$, any algorithm that finds an $\varepsilon$-approximate stationary point must make a number of queries that is exponential in $1/\varepsilon$ or $d$.
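To make the query model concrete, here is a minimal sketch (not from the paper) of projected gradient descent-ascent that accesses $f$ and $\nabla f$ only through a counting oracle. The box projection, step size, and displacement-based stopping rule are illustrative assumptions; the point is that the algorithm's cost is measured purely in oracle queries.

```python
import numpy as np

def make_oracle(f, grad_f):
    """Wrap f and its gradient as a query-counting black-box oracle."""
    count = {"queries": 0}
    def oracle(x, y):
        count["queries"] += 1
        return f(x, y), grad_f(x, y)
    return oracle, count

def projected_gda(oracle, d, eps=1e-2, step=0.1, max_queries=10_000):
    """Projected gradient descent-ascent on [0,1]^d x [0,1]^d.

    Stops when one projected step barely moves the iterate -- a simple
    stand-in for an eps-approximate stationary point.
    """
    x = np.full(d, 0.5)
    y = np.full(d, 0.5)
    for _ in range(max_queries):
        _, (gx, gy) = oracle(x, y)
        # x minimizes, y maximizes; project back onto the unit box.
        x_new = np.clip(x - step * gx, 0.0, 1.0)
        y_new = np.clip(y + step * gy, 0.0, 1.0)
        if max(np.linalg.norm(x_new - x), np.linalg.norm(y_new - y)) <= eps * step:
            return x_new, y_new
        x, y = x_new, y_new
    return x, y

# Toy bilinear example f(x, y) = x . y. This instance is convex-concave,
# so the method converges quickly; the paper's exponential lower bound
# applies to the nonconvex-nonconcave case.
f = lambda x, y: float(x @ y)
grad_f = lambda x, y: (y.copy(), x.copy())  # (df/dx, df/dy)
oracle, count = make_oracle(f, grad_f)
x, y = projected_gda(oracle, d=2)
```

On this easy instance the loop terminates after a handful of queries; the paper's result says that for nonconvex-nonconcave $f$, no algorithm in this oracle model, however clever, can avoid a query count exponential in $1/\varepsilon$ or $d$.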
