ArXiv TLDR

Distributionally Robust Receive Combining

arXiv: 2401.12345

Shixiong Wang, Wei Dai, Geoffrey Ye Li

eess.SP

TLDR

This paper proposes a distributionally robust receive combining framework for wireless signal estimation, handling various uncertainties without explicit channel estimation.

Key contributions

  • Proposes a distributionally robust receive combining framework for wireless signal estimation.
  • Handles various uncertainties (e.g., signal covariance, channel noise) without requiring explicit channel estimation.
  • Unifies existing linear combiners (e.g., diagonal loading and eigenvalue thresholding) as special cases, and extends to nonlinear estimators in reproducing kernel Hilbert space (RKHS) and neural-network function spaces.
  • Proves that ridge and kernel ridge regression are distributionally robust against diagonal perturbations of the feature covariance.
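The diagonal-loading combiner named above can be sketched numerically. This is an illustrative toy, not the paper's algorithm: the antenna count, pilot length, and loading factor `lam` are made-up values. The point it shows is that the combiner is built purely from pilot sample statistics, with no explicit channel estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4 receive antennas, 32 pilot snapshots.
n_rx, n_pilots = 4, 32
h = rng.normal(size=(n_rx, 1)) + 1j * rng.normal(size=(n_rx, 1))  # unknown channel
s = (rng.normal(size=n_pilots) + 1j * rng.normal(size=n_pilots)) / np.sqrt(2)
noise = 0.1 * (rng.normal(size=(n_rx, n_pilots))
               + 1j * rng.normal(size=(n_rx, n_pilots)))
y = h @ s[None, :] + noise                # received pilot snapshots

# Sample statistics from pilots only -- no explicit channel estimation step.
R = (y @ y.conj().T) / n_pilots           # sample receive covariance
p = (y @ s.conj()) / n_pilots             # sample signal cross-correlation

# Diagonally loaded combiner: the load lam regularizes the ill-conditioned
# sample covariance, the classic robustification this framework generalizes.
lam = 0.05
w = np.linalg.solve(R + lam * np.eye(n_rx), p)

s_hat = w.conj().T @ y                    # combined (estimated) signal
mse = float(np.mean(np.abs(s_hat - s) ** 2))
print(f"MSE with diagonal loading: {mse:.4f}")
```

Setting `lam = 0` recovers the plain sample-statistics combiner; the loading trades a little bias for robustness to finite-pilot covariance error.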

Why it matters

This framework offers a robust approach to wireless signal estimation, crucial for integrated sensing and communication systems. It simplifies design and improves performance by handling uncertainties without explicit channel estimation. The work also bridges communication theory and statistical learning, proving robustness of methods like ridge regression.
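One quick way to see the ridge-robustness claim in action is to perturb the diagonal of an ill-conditioned feature Gram matrix and compare how much the ordinary least-squares and ridge solutions move. The setup below is a hypothetical sketch, not the paper's proof: the sizes, seed, and load `lam` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 30, 8
X = rng.normal(size=(n, d))
X[:, -1] = X[:, 0] + 1e-2 * rng.normal(size=n)  # near-collinear column -> ill-conditioned Gram
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

G = X.T @ X
b = X.T @ y
lam = 2.0

# Baseline solutions: plain least squares vs. ridge (diagonally loaded Gram).
w_ols = np.linalg.solve(G, b)
w_ridge = np.linalg.solve(G + lam * np.eye(d), b)

# Random bounded diagonal perturbations of the feature covariance -- the
# uncertainty model the robustness result is stated against.
shifts_ols, shifts_ridge = [], []
for _ in range(200):
    D = np.diag(rng.uniform(0.0, lam, size=d))
    shifts_ols.append(np.linalg.norm(np.linalg.solve(G + D, b) - w_ols))
    shifts_ridge.append(np.linalg.norm(
        np.linalg.solve(G + lam * np.eye(d) + D, b) - w_ridge))

print(f"mean OLS shift under perturbation:   {np.mean(shifts_ols):.3f}")
print(f"mean ridge shift under perturbation: {np.mean(shifts_ridge):.3f}")
```

The loaded system's solution typically moves far less, which is the intuition behind the distributional-robustness result.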

Original Abstract

This article investigates signal estimation in wireless transmission (i.e., receive combining) from the perspective of statistical machine learning, where the transmit signals may be from an integrated sensing and communication system; that is, 1) signals may be not only discrete constellation points but also arbitrary complex values; 2) signals may be spatially correlated. Particular attention is paid to handling various uncertainties such as the uncertainty of the transmit signal covariance, the uncertainty of the channel matrix, the uncertainty of the channel noise covariance, the existence of channel impulse noises, the non-ideality of the power amplifiers, and the limited sample size of pilots. To proceed, a distributionally robust receive combining framework that is insensitive to the above uncertainties is proposed, which reveals that channel estimation is not a necessary operation. For optimal linear estimation, the proposed framework includes several existing combiners as special cases such as diagonal loading and eigenvalue thresholding. For optimal nonlinear estimation, estimators are limited in reproducing kernel Hilbert spaces and neural network function spaces, and corresponding uncertainty-aware solutions (e.g., kernelized diagonal loading) are derived. In addition, we prove that the ridge and kernel ridge regression methods in machine learning are distributionally robust against diagonal perturbation in feature covariance.
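The abstract's "kernelized diagonal loading" suggests the familiar kernel-ridge form, where the Gram matrix K, rather than a sample covariance, receives the diagonal load. Below is a minimal sketch of that form on toy 1-D data; whether it matches the paper's exact estimator is an assumption here, and the Gaussian kernel width `gamma` and load `lam` are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sets A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

# Toy 1-D regression data.
X = rng.uniform(-2, 2, size=(60, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=60)

# Kernel ridge regression: the Gram matrix is diagonally loaded by lam,
# mirroring the loading of the sample covariance in the linear case.
lam = 0.1
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Predict on a grid and check the fit roughly tracks the noiseless target.
X_test = np.linspace(-2, 2, 50)[:, None]
y_hat = rbf_kernel(X_test, X) @ alpha
err = float(np.max(np.abs(y_hat - np.sin(2 * X_test[:, 0]))))
print(f"max abs error vs. noiseless target: {err:.3f}")
```

Note the parallel with the linear combiner: `R + lam*I` there, `K + lam*I` here, the same diagonal load applied in kernel form.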
