Frictionless Love: Associations Between AI Companion Roles and Behavioral Addiction
Vibhor Agarwal, Ke Zhou, Edyta Paulina Bogucka, Daniele Quercia
TLDR
This study reveals how different AI companion roles shape user interactions, perceived benefits, harms, and links to behavioral addiction.
Key contributions
- Analyzed 248K Reddit posts to understand AI companion interactions.
- Identified ten distinct AI companion roles, such as 'soulmate' and 'coach'.
- Found roles shape interaction styles, perceived benefits, and harms.
- Linked 'coach' and 'guardian' roles to higher behavioral addiction risks.
Why it matters
This paper highlights the critical ethical implications of metaphorical roles in AI companion design. Understanding these role-dependent risks is crucial for developing responsible AI that mitigates potential behavioral addiction and user harm.
Original Abstract
AI companion chatbots increasingly shape how people seek social and emotional connection, sometimes substituting for relationships with romantic partners, friends, teachers, or even therapists. When these systems adopt those metaphorical roles, they are not neutral: such roles structure people's ways of interacting, distribute perceived AI harms and benefits, and may reflect behavioral addiction signs. Yet these role-dependent risks remain poorly understood. We analyze 248,830 posts from seven prominent Reddit communities describing interactions with AI companions. We identify ten recurring metaphorical roles (for example, soulmate, philosopher, and coach) and show that each role supports distinct ways of interacting. We then extract the perceived AI harms and AI benefits associated with these role-specific interactions and link them to behavioral addiction signs, all of which have been inferred from the text of the posts. AI soulmate companions are associated with romance-centered ways of interacting, offering emotional support but also introducing emotional manipulation and distress, culminating in strong attachment. In contrast, AI coach and guardian companions are associated with practical benefits such as personal growth and task support, yet are nonetheless more frequently associated with behavioral addiction signs such as daily life disruptions and damage to offline relationships. These findings show that metaphorical roles are a central ethical design concern for responsible AI companions.