ArXiv TLDR

On Bayesian Softmax-Gated Mixture-of-Experts Models

arXiv:2604.20551

Nicola Bariletto, Huy Nguyen, Nhat Ho, Alessandro Rinaldo

stat.ML, cs.LG

TLDR

One of the first systematic theoretical analyses of Bayesian softmax-gated Mixture-of-Experts models, covering density estimation, parameter estimation, and model selection.

Key contributions

  • Establish posterior contraction rates for density estimation, both with a fixed, known number of experts and with a random, learnable number.
  • Derive convergence guarantees for parameter estimation using tailored Voronoi-type losses that account for the identifiability structure of MoE models (see the sketch after this list).
  • Propose and analyze two strategies for selecting the optimal number of experts in MoE models.
  • Provide one of the first systematic theoretical analyses of Bayesian softmax-gated Mixture-of-Experts models.
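
For intuition on the parameter-estimation result, the sketch below shows the general shape of Voronoi-type losses used in earlier frequentist analyses of over-specified mixture models; the losses in this paper are tailored Bayesian variants and may differ in their exact exponents and weighting. All notation here (mixing measure $G$, fitted atoms $\theta_i$ and weights $\pi_i$, true atoms $\theta_j^{*}$ and weights $\pi_j^{*}$) is illustrative rather than taken from the paper.

\[
\mathcal{V}_j(G) \;=\; \bigl\{\, i \,:\, \lVert \theta_i - \theta_j^{*} \rVert \le \lVert \theta_i - \theta_\ell^{*} \rVert \ \text{for all } \ell \,\bigr\},
\qquad j = 1, \dots, K^{*},
\]
\[
D(G, G^{*}) \;=\; \sum_{j=1}^{K^{*}} \Bigl|\, \sum_{i \in \mathcal{V}_j} \pi_i - \pi_j^{*} \Bigr|
\;+\; \sum_{j=1}^{K^{*}} \sum_{i \in \mathcal{V}_j} \pi_i \,\lVert \theta_i - \theta_j^{*} \rVert^{\,r_j}.
\]

Each Voronoi cell $\mathcal{V}_j(G)$ collects the fitted atoms closest to the true atom $\theta_j^{*}$, so the loss measures how well the total fitted weight and the fitted parameters in each cell recover the corresponding true component; the exponents $r_j$ typically depend on whether a true atom is fitted exactly or over-fitted by several atoms.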

Why it matters

Mixture-of-Experts models are widely used to learn complex probabilistic input-output relationships, yet their theoretical properties in the Bayesian framework have remained largely unexplored. This work fills that gap with a rigorous asymptotic analysis of softmax-gated MoE models, and the findings offer theory-grounded insights for designing robust and effective MoE systems.

Original Abstract

Mixture-of-experts models provide a flexible framework for learning complex probabilistic input-output relationships by combining multiple expert models through an input-dependent gating mechanism. These models have become increasingly prominent in modern machine learning, yet their theoretical properties in the Bayesian framework remain largely unexplored. In this paper, we study Bayesian mixture-of-experts models, focusing on the ubiquitous softmax-based gating mechanism. Specifically, we investigate the asymptotic behavior of the posterior distribution for three fundamental statistical tasks: density estimation, parameter estimation, and model selection. First, we establish posterior contraction rates for density estimation, both in the regimes with a fixed, known number of experts and with a random learnable number of experts. We then analyze parameter estimation and derive convergence guarantees based on tailored Voronoi-type losses, which account for the complex identifiability structure of mixture-of-experts models. Finally, we propose and analyze two complementary strategies for selecting the number of experts. Taken together, these results provide one of the first systematic theoretical analyses of Bayesian mixture-of-experts models with softmax gating, and yield several theory-grounded insights for practical model design.
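
To make the setup concrete, a softmax-gated mixture of experts typically models the conditional density of an output $Y$ given an input $X = x$ as below; the notation is a generic sketch rather than the paper's exact parameterization.

\[
p(y \mid x) \;=\; \sum_{k=1}^{K}
\frac{\exp\!\bigl(\beta_{0k} + \beta_{1k}^{\top} x\bigr)}
{\sum_{j=1}^{K} \exp\!\bigl(\beta_{0j} + \beta_{1j}^{\top} x\bigr)}
\, f\bigl(y \mid h(x, \eta_k), \nu_k\bigr),
\]

where the softmax ratio is the input-dependent gate assigning expert $k$ its weight, and each expert $f(\cdot \mid h(x, \eta_k), \nu_k)$ is a parametric conditional density (for instance a Gaussian with mean function $h(x, \eta_k)$ and scale $\nu_k$). In the Bayesian treatment studied here, a prior is placed on the gating and expert parameters (and possibly on the number of experts $K$), and the paper characterizes how the resulting posterior concentrates as the sample size grows.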
