ArXiv TLDR

Modeling Behavioral Intensity and Transitions for Generative Recommendation

arXiv: 2604.24472

Wenxuan Yang, Xiaoyang Xu, Hanyu Zhang, Zhexuan Xu, Wanqiang Xiong, et al.

cs.IR · cs.AI · cs.LG

TLDR

BITRec is a new generative multi-behavior recommendation framework that models behavioral intensity and transitions for improved user conversion prediction.

Key contributions

  • Proposes BITRec, a novel generative multi-behavior recommendation framework.
  • Models behavioral intensity via Hierarchical Behavior Aggregation (HBA).
  • Encodes transition patterns using Transition Relation Encoding (TRE).
  • Achieves consistent 15-23% gains across multiple metrics on four large-scale datasets (RetailRocket, Taobao, Tmall, Insurance).
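To make the Transition Relation Encoding (TRE) idea concrete, here is a minimal sketch of one plausible realization: a learnable behavior-to-behavior relation matrix added as a bias inside scaled dot-product attention, so that attention between two tokens depends on their behavior pair, not just their content. The function name, the specific bias placement, and the behavior-type encoding (view=0, cart=1, purchase=2) are all illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_transition_bias(Q, K, V, behaviors, R):
    """Scaled dot-product attention plus a behavior-transition bias.

    R[b_i, b_j] is a learnable score for attending from a token with
    behavior b_i to a token with behavior b_j (hypothetical stand-in
    for the paper's learnable relation matrices).
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)           # (L, L) content-based scores
    bias = R[np.ix_(behaviors, behaviors)]  # (L, L) transition-pair bias
    weights = softmax(scores + bias, axis=-1)
    return weights @ V, weights

# Toy 5-token sequence with behavior types view=0, cart=1, purchase=2.
rng = np.random.default_rng(0)
L, d, B = 5, 8, 3
Q = rng.normal(size=(L, d))
K = rng.normal(size=(L, d))
V = rng.normal(size=(L, d))
behaviors = np.array([0, 0, 1, 0, 2])
R = rng.normal(size=(B, B)) * 0.1           # would be learned end-to-end
out, w = attention_with_transition_bias(Q, K, V, behaviors, R)
```

Because the bias is indexed by behavior pairs rather than positions, the model can, e.g., learn that cart-to-purchase transitions deserve stronger dependency activation than view-to-view ones.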

Why it matters

Existing generative recommendation models treat behaviors as auxiliary token features, overlooking differences in behavioral intensity and the transition patterns between behaviors. BITRec addresses this by introducing structured behavioral modeling, yielding substantial gains in predicting user conversions and making generative recommendation more effective and nuanced.

Original Abstract

Multi-behavior recommendation aims to predict user conversions by modeling various interaction types that carry distinct intent signals. Recently, generative sequence modeling methods have emerged as an important paradigm for multi-behavior recommendation by achieving flexible sequence generation. However, existing generative methods typically treat behaviors as auxiliary token features and feed them into unified attention mechanisms. These models implicitly assume uniform activation of dependencies among historical behaviors, thereby failing to discern differences in intensity or capture transition patterns. To address these limitations, we propose BITRec, a novel generative multi-behavior recommendation framework that introduces structured behavioral modeling through selective dependency activation. BITRec incorporates (i) Hierarchical Behavior Aggregation (HBA), which explicitly models behavioral intensity differences through separated exploration and commitment pathways, and (ii) Transition Relation Encoding (TRE), which encodes transition structures through explicit learnable relation matrices. Experiments on four large-scale datasets (RetailRocket, Taobao, Tmall, Insurance Dataset) with millions of interactions achieve consistent improvements of 15-23% across multiple metrics, with peak gains of 22.79% MRR on Tmall and 17.83% HR@10, 17.55% NDCG@10 on Taobao.
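The abstract's description of Hierarchical Behavior Aggregation (HBA), with its "separated exploration and commitment pathways", can be sketched as follows: pool token representations separately for exploration behaviors (e.g., views, carts) and commitment behaviors (e.g., purchases), then blend the two summaries. The pooling choice, the scalar gate, and the behavior-type assignment are illustrative assumptions; the paper's actual pathway design is not specified in this summary.

```python
import numpy as np

def hierarchical_behavior_aggregation(H, behaviors, commitment_types=(2,)):
    """Aggregate hidden states through separate exploration and commitment
    pathways, then blend with a gate (hypothetical sketch of HBA).

    H: (L, d) token representations; behaviors: (L,) integer behavior types.
    """
    commit_mask = np.isin(behaviors, commitment_types)
    d = H.shape[1]
    # Separate pathways: mean-pool each behavior group independently.
    explore = H[~commit_mask].mean(axis=0) if (~commit_mask).any() else np.zeros(d)
    commit = H[commit_mask].mean(axis=0) if commit_mask.any() else np.zeros(d)
    # Toy scalar gate; a learned gating network would replace this in practice.
    gate = 1.0 / (1.0 + np.exp(-float(explore @ commit)))
    return gate * commit + (1.0 - gate) * explore

# Toy 5-token sequence: view=0, cart=1, purchase=2.
rng = np.random.default_rng(1)
H = rng.normal(size=(5, 8))
behaviors = np.array([0, 1, 0, 2, 2])
summary = hierarchical_behavior_aggregation(H, behaviors)
```

Keeping the two pathways separate until the final blend is what lets the model represent intensity differences explicitly, rather than averaging weak exploration signals and strong commitment signals into one undifferentiated pool.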
