From Transfer to Collaboration: A Federated Framework for Cross-Market Sequential Recommendation
Jundong Chen, Honglei Zhang, Xiangmou Qu, Haoxuan Li, Han Yu, et al.
TLDR
FeCoSR introduces a federated collaboration framework for cross-market sequential recommendation, addressing data isolation and market heterogeneity.
Key contributions
- Introduces FeCoSR, a federated many-to-many collaboration paradigm for cross-market sequential recommendation.
- Employs federated pretraining for shared behaviors and local fine-tuning for market-specific preferences.
- Proposes Semantic Soft Cross-Entropy (S^2CE) to mitigate market heterogeneity and negative transfer.
- Designs a market-specific adaptation module to capture local item preferences during fine-tuning.
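The pretrain-then-finetune flow above can be sketched as a FedAvg-style loop followed by per-market local steps. This is a minimal illustration, not the paper's implementation: the aggregation rule, the `local_update` callback, and the market dictionaries are all hypothetical stand-ins.

```python
import numpy as np

def fedavg(market_params, market_sizes):
    # Weighted average of per-market parameter vectors
    # (a FedAvg-style step; the paper's exact aggregation may differ).
    w = np.asarray(market_sizes, dtype=float)
    w /= w.sum()
    return (w[:, None] * np.stack(market_params)).sum(axis=0)

def pretrain_then_finetune(local_update, markets, init, rounds=5):
    # Stage 1: federated pretraining of shared behavior-level parameters.
    shared = init
    for _ in range(rounds):
        updates = [local_update(shared, m) for m in markets]
        shared = fedavg(updates, [m["size"] for m in markets])
    # Stage 2: each market fine-tunes its own copy locally
    # (one extra local step stands in for market-specific adaptation).
    return {m["name"]: local_update(shared, m) for m in markets}
```

In this sketch every market both contributes to and receives the shared model, which is the many-to-many property the contributions list describes.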
Why it matters
This paper introduces a novel federated framework that significantly improves cross-market sequential recommendations. By enabling collaborative learning across diverse markets, it overcomes limitations of traditional transfer methods like negative transfer and source degradation. This approach ensures all markets benefit, leading to more effective and personalized recommendations.
Original Abstract
Cross-market recommendation (CMR) aims to enhance recommendation performance across multiple markets. Due to its inherent characteristics, i.e., data isolation, non-overlapping users, and market heterogeneity, CMR introduces unique challenges and fundamentally differs from cross-domain recommendation (CDR). Existing CMR approaches largely inherit CDR by adopting the one-to-one transfer paradigm, where a model is pretrained on a source market and then fine-tuned on a target market. However, such a paradigm suffers from CH1. source degradation, where the source market sacrifices its own performance for the target markets, and CH2. negative transfer, where market heterogeneity leads to suboptimal performance in target markets. To address these challenges, we propose FeCoSR, a novel federated collaboration framework for cross-market sequential recommendation. Specifically, to tackle CH1, we introduce a many-to-many collaboration paradigm that enables all markets to jointly participate in and benefit from training. It consists of a federated pretraining stage for capturing shared behavior-level patterns, followed by local fine-tuning for market-specific item-level preferences. For CH2, we theoretically and empirically show that vanilla Cross-Entropy (CE) exacerbates market heterogeneity, undermining federated optimization. To address this, we propose a Semantic Soft Cross-Entropy (S^2CE) that leverages shared semantic information to facilitate collaborative behavioral learning across markets. Then, we design a market-specific adaptation module during fine-tuning to capture local item preferences. Extensive experiments on real-world datasets demonstrate the advantages of FeCoSR over other methods.
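One plausible reading of S^2CE is a cross-entropy against a semantically softened target: instead of a one-hot label, the target distribution spreads mass over items similar to the ground-truth item, so heterogeneous markets share gradient signal through semantics. The sketch below assumes this form with a temperature `tau`; the paper's exact formulation may differ, and `item_emb` is a hypothetical item-embedding matrix.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def semantic_soft_ce(logits, target, item_emb, tau=0.1):
    # Soften the one-hot target using semantic similarity between the
    # target item and all items, then take cross-entropy against it.
    # (An assumed form of S^2CE, not the paper's verified equation.)
    sims = item_emb @ item_emb[target]      # similarity to target item
    soft_target = softmax(sims / tau)       # semantic soft label
    log_probs = np.log(softmax(logits))
    return -(soft_target * log_probs).sum()
```

As `tau` shrinks, the soft target collapses back to one-hot and the loss reduces to vanilla CE; larger `tau` distributes credit over semantically related items, which is the mechanism the abstract credits for mitigating heterogeneity.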