Routers Learn the Geometry of Their Experts: Geometric Coupling in Sparse Mixture-of-Experts
Sagi Ahrac, Noya Hochwald, Mor Geva
TLDR
This paper reveals a geometric coupling between SMoE routers and experts, explaining how routers learn effective assignment geometry and proposing a coupling-based router.
Key contributions
- Discovered 'geometric coupling' where router and expert gradients align, accumulating shared token history.
- Showed auxiliary load-balancing losses disrupt this coupling, making router directions less distinct.
- Introduced a parameter-free online K-Means router based on geometric coupling for improved load balancing.
- Demonstrated the K-Means router achieves low load imbalance with minimal perplexity increase.
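The parameter-free router in the contributions above can be sketched in plain Python. This is a minimal illustration based only on the abstract's description (each expert keeps a running average of the hidden states routed to it; tokens are assigned by cosine similarity), not the paper's actual implementation; class and method names are invented for the example.

```python
import math

class OnlineKMeansRouter:
    """Sketch of a parameter-free online K-Means router:
    each expert keeps a running mean (centroid) of the hidden
    states routed to it, and an incoming token is sent to the
    expert whose centroid is most cosine-similar."""

    def __init__(self, num_experts, dim):
        self.counts = [0] * num_experts
        # Centroids start at zero; ties broken by lowest expert index.
        self.centroids = [[0.0] * dim for _ in range(num_experts)]

    @staticmethod
    def _cosine(a, b):
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        if na == 0.0 or nb == 0.0:
            return 0.0
        return sum(x * y for x, y in zip(a, b)) / (na * nb)

    def route(self, h):
        # Assign the token to the most similar centroid.
        sims = [self._cosine(c, h) for c in self.centroids]
        expert = max(range(len(sims)), key=lambda e: sims[e])
        # Online mean update of the chosen expert's centroid.
        self.counts[expert] += 1
        n = self.counts[expert]
        self.centroids[expert] = [
            c + (x - c) / n for c, x in zip(self.centroids[expert], h)
        ]
        return expert
```

Because assignment depends only on running statistics of routed tokens, the router has no learned parameters, which is what lets it be compared head-to-head against auxiliary-loss and loss-free balancing.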
Why it matters
Understanding how SMoE routers form their assignment geometry addresses two core training challenges: routing collapse onto a few experts and the loss of specialization caused by auxiliary load-balancing losses. The geometric-coupling insight offers a principled basis for designing more stable, better load-balanced sparse models, as the paper's parameter-free K-Means router demonstrates.
Original Abstract
Sparse Mixture-of-Experts (SMoE) models enable scaling language models efficiently, but training them remains challenging, as routing can collapse onto few experts and auxiliary load-balancing losses can reduce specialization. Motivated by these hurdles, we study how routing decisions in SMoEs are formed mechanistically. First, we reveal a geometric coupling between routers and their corresponding experts. For a given token, the router weights for the selected expert and the expert weights processing it receive gradients along the same input direction, differing only in scalar coefficients. Thus, matched router--expert directions accumulate the same routed token history. This theoretical coupling also appears empirically in routing dynamics. In a $1$B SMoE trained from scratch, higher router scores predict stronger expert neuron activations, showing that routing decisions are mirrored inside the selected expert. Next, we analyze the effects of auxiliary load balancing on the router--expert geometric coupling, showing that such losses break this structure by spreading input-directed gradients across router weights, making distinct router directions nearly three times more similar to each other. Last, we demonstrate the centrality of geometric coupling for effective routing with a parameter-free online K-Means router, in which each expert maintains a running average of the hidden states routed to it and tokens are assigned based on cosine similarity. Compared with auxiliary-loss and loss-free balancing, this router achieves the lowest load imbalance with only a modest perplexity increase, indicating that geometric coupling captures a substantial part of what the router learns. Overall, our results explain how routers form assignment geometry that supports an effective division of labor.
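The gradient alignment claimed in the abstract can be illustrated in a simplified setting (a linear router and a linear expert layer; this is a sketch of the intuition, not the paper's full derivation). For a token $x$, let the router logit for expert $e$ be $s_e = w_e^\top x$ and let the selected expert compute $y = g_e \, W_e x$, where $g_e$ is the gate value. Then by the chain rule,

$$
\frac{\partial \mathcal{L}}{\partial w_e} = \frac{\partial \mathcal{L}}{\partial s_e}\, x,
\qquad
\frac{\partial \mathcal{L}}{\partial (W_e)_{i,:}} = g_e \,\frac{\partial \mathcal{L}}{\partial y_i}\, x^\top .
$$

In this simplified case, both the router direction $w_e$ and every row of the expert weights $W_e$ receive gradients along the same input direction $x$, differing only in scalar coefficients, so matched router--expert directions accumulate the same routed token history.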