Query Lower Bounds for Diffusion Sampling
TLDR
This paper establishes the first information-theoretic lower bounds on score queries for diffusion sampling, showing that any sampler needs $\widetilde{\Omega}(\sqrt{d})$ adaptive score queries in $d$ dimensions.
Key contributions
- Establishes the first information-theoretic score query lower bounds for diffusion sampling.
- Proves that $\widetilde{\Omega}(\sqrt{d})$ adaptive score queries are required for sampling from $d$-dimensional distributions.
- Demonstrates that samplers must explore $\widetilde{\Omega}(\sqrt{d})$ distinct noise levels.
- Formally explains why multiscale noise schedules are essential in practice (see the illustration after this list).
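As a concrete illustration of the noise-level claim (the notation below is ours, not taken from the paper), consider a geometric noise schedule, the standard multiscale choice; in this language, the lower bound says the number of levels $K$ cannot be much smaller than $\sqrt{d}$:

```latex
% Hypothetical multiscale schedule (illustrative assumption, not the paper's):
% K geometrically spaced noise levels between sigma_min and sigma_max.
\[
  \sigma_k = \sigma_{\max}\Bigl(\frac{\sigma_{\min}}{\sigma_{\max}}\Bigr)^{k/K},
  \qquad k = 0, 1, \dots, K,
  \qquad K = \widetilde{\Omega}\bigl(\sqrt{d}\bigr).
\]
```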
Why it matters
This paper clarifies the fundamental limits of accelerating diffusion sampling. It provides theoretical grounding for practical design choices like multiscale noise schedules, guiding future research in efficient generative models.
Original Abstract
Diffusion models generate samples by iteratively querying learned score estimates. A rapidly growing literature focuses on accelerating sampling by minimizing the number of score evaluations, yet the information-theoretic limits of such acceleration remain unclear. In this work, we establish the first score query lower bounds for diffusion sampling. We prove that for $d$-dimensional distributions, given access to score estimates with polynomial accuracy $\varepsilon=d^{-O(1)}$ (in any $L^p$ sense), any sampling algorithm requires $\widetilde{\Omega}(\sqrt{d})$ adaptive score queries. In particular, our proof shows that any sampler must search over $\widetilde{\Omega}(\sqrt{d})$ distinct noise levels, providing a formal explanation for why multiscale noise schedules are necessary in practice.
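To make the query counting concrete, here is a minimal sketch, assuming a toy $\mathcal{N}(0, I_d)$ target whose smoothed score is available in closed form (the dimension, schedule endpoints, and sampler below are our illustrative assumptions, not the paper's construction): a variance-exploding probability-flow sampler that issues one adaptive score query per level of a geometric schedule with $K \approx \sqrt{d}$ levels, the regime the lower bound says cannot be beaten.

```python
import numpy as np

# Hedged toy illustration (our construction, not the paper's): a sampler of
# the kind the lower bound constrains. It makes one adaptive score query per
# noise level on a geometric ("multiscale") schedule with K ~ sqrt(d) levels.
d = 1024                      # ambient dimension
K = int(np.ceil(np.sqrt(d)))  # ~sqrt(d) noise levels, matching the bound
sigma_max, sigma_min = 50.0, 0.01
sigmas = sigma_max * (sigma_min / sigma_max) ** (np.arange(K + 1) / K)

queries = 0
def score(x, sigma):
    """Score oracle for N(0, I_d) data smoothed by N(0, sigma^2 I_d) noise."""
    global queries
    queries += 1
    return -x / (1.0 + sigma ** 2)  # exact score of N(0, (1 + sigma^2) I_d)

rng = np.random.default_rng(0)
x = np.sqrt(1.0 + sigma_max ** 2) * rng.standard_normal(d)  # top noise level

# Euler steps of the variance-exploding probability-flow ODE,
# dx = -(1/2) d(sigma^2) * score(x, sigma), run from sigma_max to sigma_min.
for k in range(K):
    s_cur, s_next = sigmas[k], sigmas[k + 1]
    x = x + 0.5 * (s_cur ** 2 - s_next ** 2) * score(x, s_cur)

print(f"d={d}  noise levels={K}  score queries={queries}")  # queries == K
```

In a real diffusion model the oracle would be a learned score network; the paper's result says that no choice of schedule or step rule can drive the adaptive query count below $\widetilde{\Omega}(\sqrt{d})$.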