ArXiv TLDR

Pliable rejection sampling

arXiv:1606.04755

Akram Erraqabi, Michal Valko, Alexandra Carpentier, Odalric-Ambrym Maillard

stat.ML, cs.LG

TLDR

Pliable Rejection Sampling (PRS) learns the sampling proposal with a kernel estimator, yielding samples that are, with high probability, i.i.d. from the target, together with a guarantee on the number of accepted samples.

Key contributions

  • Addresses the high rejection rate limitation of traditional rejection sampling.
  • Introduces Pliable Rejection Sampling (PRS) that learns proposals via a kernel estimator.
  • Ensures that, with high probability, the samples are i.i.d. and distributed according to the target density f.
  • Offers a performance guarantee on the number of accepted samples.
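To make the two-stage idea concrete, here is a minimal runnable sketch: a pilot run of classic rejection sampling, followed by a second run whose proposal is a Gaussian kernel density estimate fit to the pilot sample. This is an illustration of the general approach, not the paper's algorithm; the Beta-shaped target, Silverman's bandwidth rule, and the grid-fitted envelope constant are all assumptions made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: an unnormalized Beta(2, 5) density on [0, 1].
def f(x):
    return x * (1 - x) ** 4 if 0.0 <= x <= 1.0 else 0.0

def reject_sample(pdf_prop, sample_prop, M, n):
    """Draw x ~ proposal; accept with probability f(x) / (M * pdf_prop(x))."""
    out, tries = [], 0
    while len(out) < n:
        x = sample_prop()
        tries += 1
        if rng.uniform() * M * pdf_prop(x) <= f(x):
            out.append(x)
    return np.array(out), tries

# Stage 1: classic rejection sampling with a flat proposal on [0, 1].
# Envelope: f peaks at x = 1/5, so M1 = f(0.2) dominates f everywhere.
M1 = f(0.2)
pilot, tries1 = reject_sample(lambda x: 1.0, lambda: rng.uniform(), M1, 500)

# Stage 2: learn the proposal from the pilot sample with a Gaussian kernel
# estimator, then rerun rejection sampling against it.
h = 1.06 * pilot.std() * len(pilot) ** (-1 / 5)  # Silverman's bandwidth rule

def kde_pdf(x):
    return np.mean(np.exp(-0.5 * ((x - pilot) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

def kde_sample():
    return rng.choice(pilot) + h * rng.standard_normal()

# Heuristic envelope constant fitted on a grid; the paper instead inflates
# the kernel estimate by a provable margin so the envelope holds with high
# probability.
grid = np.linspace(0.001, 0.999, 999)
M2 = 1.1 * max(f(x) / kde_pdf(x) for x in grid)

final, tries2 = reject_sample(kde_pdf, kde_sample, M2, 500)
print(f"flat proposal acceptance rate:    {500 / tries1:.2f}")
print(f"learned proposal acceptance rate: {500 / tries2:.2f}")
```

Because the learned proposal tracks the shape of f much more closely than the flat one, the second stage can use a far smaller envelope constant and wastes fewer draws.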

Why it matters

This paper significantly improves rejection sampling, a fundamental technique for difficult distributions, by addressing its high rejection rate. It offers a robust method with performance guarantees, overcoming limitations of prior adaptive approaches. This makes sampling from complex distributions more efficient and reliable.
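The efficiency claim follows from the standard rejection-sampling identity (textbook material, not specific to this paper): with target density f, proposal g, and envelope constant M satisfying f(x) ≤ M g(x) for all x, each proposed draw is accepted with probability

```latex
% Draw $x \sim g$, accept with probability $f(x) / (M g(x))$:
\Pr[\text{accept}] = \int \frac{f(x)}{M\,g(x)}\, g(x)\, dx = \frac{1}{M}
% (for normalized $f$). The closer the learned proposal $g$ is to $f$,
% the smaller the admissible envelope constant $M$, and hence the higher
% the acceptance rate.
```

so learning a proposal that closely matches f directly drives the acceptance rate toward one.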

Original Abstract

Rejection sampling is a technique for sampling from difficult distributions. However, its use is limited due to a high rejection rate. Common adaptive rejection sampling methods either work only for very specific distributions or without performance guarantees. In this paper, we present pliable rejection sampling (PRS), a new approach to rejection sampling, where we learn the sampling proposal using a kernel estimator. Since our method builds on rejection sampling, the samples obtained are with high probability i.i.d. and distributed according to f. Moreover, PRS comes with a guarantee on the number of accepted samples.

📬 Weekly AI Paper Digest

Get the top 10 AI/ML arXiv papers from the week — summarized, scored, and delivered to your inbox every Monday.