ArXiv TLDR

Differentially Private Conformal Prediction

2604.14621

Jiamei Wu, Ce Zhang, Zhipeng Cai, Jingsen Kong, Bei Jiang + 2 more

stat.ML · cs.LG

TLDR

DPCP is a fully private conformal prediction method that builds on a non-splitting conformal procedure, producing tighter prediction sets than prior private split-conformal methods under the same privacy budget.

Key contributions

  • Introduces "differential CP," a non-splitting conformal procedure that avoids the efficiency loss caused by data splitting and bridges oracle CP and private conformal inference.
  • Develops "DPCP," a fully private method combining DP model training with a private quantile mechanism.
  • Establishes end-to-end privacy for DPCP and investigates its coverage properties.
  • DPCP produces tighter prediction sets than existing private split conformal approaches under the same privacy budget.
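The "private quantile mechanism" mentioned above is not specified in this summary, but a standard way to release a DP quantile of calibration scores is the exponential mechanism over the intervals between sorted scores. The sketch below is illustrative, not the paper's actual procedure; the function name `dp_quantile` and its parameters are hypothetical.

```python
import numpy as np

def dp_quantile(scores, q, epsilon, lo=0.0, hi=1.0, rng=None):
    """Release an epsilon-DP q-quantile of `scores` (assumed to lie in [lo, hi])
    via the exponential mechanism: sample an interval between adjacent order
    statistics with probability proportional to its length times
    exp(epsilon * utility / 2), where utility = -|rank error|."""
    rng = np.random.default_rng() if rng is None else rng
    s = np.clip(np.sort(np.asarray(scores, dtype=float)), lo, hi)
    n = len(s)
    edges = np.concatenate(([lo], s, [hi]))        # n + 1 candidate intervals
    lengths = np.diff(edges)
    # Utility of any point in interval k is -|k - q*n|; sensitivity is 1.
    utility = -np.abs(np.arange(n + 1) - q * n)
    logp = (epsilon / 2.0) * utility + np.log(np.maximum(lengths, 1e-12))
    logp -= logp.max()                              # stabilize before exponentiating
    p = np.exp(logp)
    p /= p.sum()
    k = rng.choice(n + 1, p=p)
    return rng.uniform(edges[k], edges[k + 1])      # uniform draw within the interval
```

In a split-conformal setting, the released value would serve as the calibration threshold: for absolute-residual scores, the prediction set at a new point x is [f(x) - t, f(x) + t] with t = dp_quantile(residuals, 1 - alpha, epsilon). The paper's DPCP avoids the data split itself, so this sketch only illustrates the private-quantile ingredient.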

Why it matters

Uncertainty quantification under privacy constraints is a critical challenge. This paper offers a statistically efficient solution that overcomes the efficiency loss of prior split-conformal methods: DPCP produces tighter prediction sets while maintaining strong privacy guarantees.

Original Abstract

Conformal prediction (CP) has attracted broad attention as a simple and flexible framework for uncertainty quantification through prediction sets. In this work, we study how to deploy CP under differential privacy (DP) in a statistically efficient manner. We first introduce differential CP, a non-splitting conformal procedure that avoids the efficiency loss caused by data splitting and serves as a bridge between oracle CP and private conformal inference. By exploiting the stability properties of DP mechanisms, differential CP establishes a direct connection to oracle CP and inherits corresponding validity behavior. Building on this idea, we develop Differentially Private Conformal Prediction (DPCP), a fully private procedure that combines DP model training with a private quantile mechanism for calibration. We establish the end-to-end privacy guarantee of DPCP and investigate its coverage properties under additional regularity conditions. We further study the efficiency of both differential CP and DPCP under empirical risk minimization and general regression models, showing that DPCP can produce tighter prediction sets than existing private split conformal approaches under the same privacy budget. Numerical experiments on synthetic and real datasets demonstrate the practical effectiveness of the proposed methods.
