ArXiv TLDR

UnIte: Uncertainty-based Iterative Document Sampling for Domain Adaptation in Information Retrieval

2604.25142

Jongyoon Kim, Minseong Hwang, Seung-won Hwang

cs.IR cs.AI

TLDR

UnIte improves domain adaptation for neural retrievers by sampling documents based on aleatoric and epistemic uncertainty, achieving nDCG@10 gains of +2.45 to +3.49 on BEIR with smaller training samples.

Key contributions

  • Introduces UnIte, an uncertainty-based iterative document sampling method for domain adaptation.
  • Filters documents with high aleatoric uncertainty to improve sample quality.
  • Prioritizes documents with high epistemic uncertainty for maximal learning utility.
  • Achieves significant nDCG@10 gains (+2.45 with small models, +3.49 with large) using smaller training sample sizes (4k on average).
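The two-step selection described above can be sketched as follows. The specific uncertainty estimators here (mean predictive entropy over stochastic forward passes for aleatoric uncertainty, variance across those passes for epistemic) and all function and parameter names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def sample_documents(probs, aleatoric_cutoff=0.6, k=2):
    """Illustrative sketch of uncertainty-based document sampling.

    probs: array of shape (n_docs, n_passes) holding a relevance
    probability per document from several stochastic forward passes
    (e.g., MC dropout). Estimators here are assumptions, not the
    paper's exact method.
    """
    # Epistemic uncertainty: disagreement between stochastic passes.
    epistemic = probs.var(axis=1)

    # Aleatoric uncertainty: mean per-pass binary predictive entropy,
    # high when the model is inherently unsure regardless of the pass.
    p = np.clip(probs, 1e-9, 1 - 1e-9)
    entropy = -(p * np.log(p) + (1 - p) * np.log(1 - p))
    aleatoric = entropy.mean(axis=1)

    # (1) Filter out noisy documents with high aleatoric uncertainty.
    keep = np.where(aleatoric <= aleatoric_cutoff)[0]
    # (2) Prioritize the rest by epistemic uncertainty, descending.
    ranked = keep[np.argsort(-epistemic[keep])]
    return ranked[:k]

# Hypothetical scores: doc 0 is filtered (entropy at p=0.5 is ~0.69),
# docs 1 and 3 show pass-to-pass disagreement, doc 2 is confidently stable.
probs = np.array([
    [0.50, 0.50, 0.50],
    [0.90, 0.10, 0.90],
    [0.95, 0.95, 0.95],
    [0.20, 0.80, 0.20],
])
selected = sample_documents(probs, aleatoric_cutoff=0.6, k=2)
print(selected)  # → [1 3]
```

In an iterative loop as the paper's name suggests, the adapted model would be re-scored after each round so that epistemic uncertainty reflects what the current model has yet to learn.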

Why it matters

This paper addresses a critical challenge in unsupervised domain adaptation for neural retrievers: selecting which documents to use for pseudo query generation, efficiently and effectively. By leveraging model uncertainty, UnIte improves adaptation quality while requiring fewer training samples, making domain adaptation more practical.

Original Abstract

Unsupervised domain adaptation generalizes neural retrievers to an unseen domain by generating pseudo queries on target domain documents. The quality and efficiency of this adaptation critically depend on which documents are selected for pseudo query generation. The existing document sampling method focuses on diversity but fails to capture model uncertainty. In contrast, we propose **Un**certainty-based **Ite**rative Document Sampling (UnIte) addressing these limitations by (1) filtering documents with high aleatoric uncertainty and (2) prioritizing those with high epistemic uncertainty, maximizing the learning utility of the current model. We conducted extensive experiments on a large corpus of BEIR with small and large models, showing significant gains of +2.45 and +3.49 nDCG@10 with a smaller training sample size, 4k on average.
