Misspecification-Averse Estimation
Isaiah Andrews, Ricky Li, Yucheng Shang
TLDR
This paper introduces a new constrained multiplier criterion for optimal estimation under likelihood misspecification, and characterizes estimators that are asymptotically optimal under this criterion.
Key contributions
- Introduces the "constrained multiplier criterion" for flexible misspecification attitudes.
- Proves a local asymptotic minimax theorem, extending a classical efficiency bound to a limit experiment with moment-constrained misspecification.
- Characterizes optimal estimators as Bayes decision rules under an exponentially tilted likelihood.
- Shows feasible plug-in analogs of these estimators are asymptotically optimal.
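To fix ideas, the exponential tilting referenced above typically combines a baseline likelihood with moment constraints via a multiplier. The display below is a generic sketch of the standard exponential-tilting construction, not the paper's exact expressions; the symbols $p_\theta$, $g$, and $\lambda$ are illustrative placeholders.

```latex
% Illustrative sketch (standard exponential tilting; notation is not the paper's):
% p_theta : baseline (possibly misspecified) likelihood
% g       : moment function encoding the misspecification constraints
% lambda  : multiplier on the moment constraints
\[
  \tilde{p}_{\theta,\lambda}(x) \;\propto\; p_\theta(x)\,
  \exp\!\bigl(\lambda^{\top} g(x,\theta)\bigr).
\]
% Under a flat prior on theta, the Bayes decision rule computed with this
% tilted likelihood is the kind of object the paper identifies as optimal.
```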
Why it matters
Classical efficiency theory assumes the likelihood is correctly specified, an assumption that often fails in practice. By introducing the robust constrained multiplier criterion and characterizing the estimators that are optimal under it, the paper extends classical efficiency bounds to misspecified settings, giving a principled basis for reliable estimation and inference.
Original Abstract
We study optimal estimation when the likelihood may be misspecified. Building on tools from the theory of decision-making under uncertainty, we analyze a class of axiomatically grounded optimality criteria which nests several existing misspecification-robust objectives. Within this class, we introduce the constrained multiplier criterion, which allows for flexible misspecification attitudes. We prove a local asymptotic minimax theorem for this criterion, extending a classical efficiency bound to a limit experiment which incorporates moment-constrained misspecification concerns. We characterize asymptotically optimal estimators as Bayes decision rules under a flat prior and an exponentially tilted likelihood that incorporates the moment constraints, and show that feasible plug-in analogs are asymptotically optimal.