Efficient Diffusion Models under Nonconvex Equality and Inequality Constraints via Landing
Kijung Jeon, Michael Muehlebach, Molei Tao
TLDR
Presents an efficient framework for constrained diffusion models on nonconvex sets, using a novel landing mechanism for equality and inequality constraints.
Key contributions
- Unified framework for constrained diffusion on generic nonconvex feasible sets.
- Introduces a computationally efficient "landing mechanism" replacing costly projections (see the sketch after this list).
- Enforces both equality and inequality constraints throughout the diffusion process.
- Leverages underdamped dynamics to accelerate mixing and reduce simulation costs.
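The digest does not reproduce the paper's equations, so the snippet below is only a minimal sketch of the general landing idea: instead of projecting each iterate back onto the feasible set, a correction drift descends a constraint-violation penalty so iterates stay near $\{h(x)=0,\ g(x)\le 0\}$. All names here (`landing_step`, `h`, `g`, `score`, `landing_weight`) are hypothetical, and the finite-difference gradient is for illustration only.

```python
import numpy as np

def landing_step(x, score, h, g, step_size=1e-2, landing_weight=1.0):
    """One landing-style update (illustrative sketch, not the paper's exact scheme).

    Rather than projecting x back onto the feasible set after a diffusion step,
    a correction drift descends the constraint-violation penalty
        rho(x) = 0.5*||h(x)||^2 + 0.5*||max(g(x), 0)||^2,
    so iterates "land" on {h(x) = 0, g(x) <= 0} without Newton solves.
    """
    def violation(y):
        eq = h(y)                     # equality residuals, target 0
        ineq = np.maximum(g(y), 0.0)  # active part of the inequality constraints
        return 0.5 * eq @ eq + 0.5 * ineq @ ineq

    # Finite-difference gradient of the penalty (illustration only; a real
    # implementation would use analytic Jacobians of h and g).
    eps = 1e-6
    grad_rho = np.array([
        (violation(x + eps * e) - violation(x - eps * e)) / (2.0 * eps)
        for e in np.eye(x.size)
    ])

    # Score-driven drift plus landing correction plus diffusion noise.
    noise = np.sqrt(2.0 * step_size) * np.random.randn(*x.shape)
    return x + step_size * (score(x) - landing_weight * grad_rho) + noise
```

As a toy usage, `h = lambda x: np.array([x @ x - 1.0])` (unit sphere) with `g = lambda x: np.array([-x[0]])` (first coordinate nonnegative) keeps repeated updates near the feasible set without any projection; the paper's actual drift, discretization, and handling of the velocity in the underdamped case may differ.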
Why it matters
Constrained generative modeling is vital for scientific and engineering tasks. This paper provides a practical, scalable, and efficient solution for diffusion on complex nonconvex sets, significantly reducing computational costs while preserving sample quality, which broadens the applicability of constrained diffusion models.
Original Abstract
Generative modeling within constrained sets is essential for scientific and engineering applications involving physical, geometric, or safety requirements (e.g., molecular generation, robotics). We present a unified framework for constrained diffusion models on generic nonconvex feasible sets $\Sigma$ that simultaneously enforces equality and inequality constraints throughout the diffusion process. Our framework incorporates both overdamped and underdamped dynamics for forward and backward sampling. A key algorithmic innovation is a computationally efficient landing mechanism that replaces costly and often ill-defined projections onto $\Sigma$, ensuring feasibility without iterative Newton solves or projection failures. By leveraging underdamped dynamics, we accelerate mixing toward the prior distribution, effectively alleviating the high simulation costs typically associated with constrained diffusion. Empirically, this approach reduces function evaluations and memory usage during both training and inference while preserving sample quality. On benchmarks featuring equality and mixed constraints, our method achieves comparable sample quality to state-of-the-art baselines while significantly reducing computational cost, providing a practical and scalable solution for diffusion on nonconvex feasible sets.
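For context, a generic underdamped (kinetic) Langevin system augmented with a feasibility drift can be written as below; this is an illustrative sketch rather than the paper's exact dynamics, and $\lambda$, $\rho$, $h$, $g$ are placeholder symbols:

$$
\begin{aligned}
\mathrm{d}x_t &= v_t\,\mathrm{d}t,\\
\mathrm{d}v_t &= -\gamma v_t\,\mathrm{d}t - \nabla U(x_t)\,\mathrm{d}t - \lambda \nabla \rho(x_t)\,\mathrm{d}t + \sqrt{2\gamma}\,\mathrm{d}W_t,
\end{aligned}
$$

where $\rho(x) = \tfrac12\|h(x)\|^2 + \tfrac12\|\max(g(x),0)\|^2$ penalizes violation of the equality constraints $h(x)=0$ and inequality constraints $g(x)\le 0$ defining $\Sigma$, $\gamma$ is the friction coefficient, and the added momentum is what accelerates mixing relative to overdamped dynamics.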