PAINT: Partner-Agnostic Intent-Aware Cooperative Transport with Legged Robots
Zhihao Cao, Tianxu An, Chenhao Li, Stelian Coros, Marco Hutter
TLDR
PAINT enables legged robots to cooperatively transport objects by inferring partner intent from proprioceptive feedback, without external sensors.
Key contributions
- Infers partner intent from proprioceptive feedback, eliminating external force-torque sensors.
- Hierarchical learning framework (PAINT) decouples intent understanding from robust locomotion.
- Enables compliant cooperative transport across diverse terrains, payloads, and partners.
- Scales to decentralized multi-robot transport and transfers across robot embodiments.
Why it matters
This paper addresses the challenge of collaborative transport in complex environments. PAINT offers a scalable and lightweight solution by inferring partner intent from proprioceptive feedback, eliminating the need for external sensors. This makes robot collaboration more practical and adaptable for diverse real-world applications.
Original Abstract
Collaborative transport requires robots to infer partner intent through physical interaction while maintaining stable loco-manipulation. This becomes particularly challenging in complex environments, where interaction signals are difficult to capture and model. We present PAINT, a lightweight yet efficient hierarchical learning framework for partner-agnostic intent-aware collaborative legged transport that infers partner intent directly from proprioceptive feedback. PAINT decouples intent understanding from terrain-robust locomotion: a high-level policy infers the partner interaction wrench using an intent estimator and a teacher-student training scheme, while a low-level locomotion backbone ensures robust execution. This enables lightweight deployment without external force-torque sensing or payload tracking. Extensive simulation and real-world experiments demonstrate compliant cooperative transport across diverse terrains, payloads, and partners. Furthermore, we show that PAINT naturally scales to decentralized multi-robot transport and transfers across robot embodiments by swapping the underlying locomotion backbone. Our results suggest that proprioceptive signals in payload-coupled interaction provide a scalable interface for partner-agnostic intent-aware collaborative transport.
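The hierarchical split described in the abstract can be sketched as a simple two-stage control loop. This is a minimal illustrative sketch, not the paper's released code: the class names, dimensions, and the linear stand-ins for the learned networks are all assumptions. The high-level stage maps a history of proprioceptive observations to an estimated 6-D partner interaction wrench; the low-level stage consumes that wrench alongside the current state to produce joint targets.

```python
import numpy as np

# Hypothetical sketch of a PAINT-style hierarchical loop.
# All names and shapes are illustrative assumptions; the learned
# networks are replaced by fixed random linear maps.

class IntentEstimator:
    """High-level policy: infers the partner interaction wrench
    (3 forces + 3 torques) from a proprioceptive history."""
    def __init__(self, obs_dim, history_len=10, wrench_dim=6, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((wrench_dim, obs_dim * history_len)) * 0.01

    def __call__(self, proprio_history):
        # Flatten the (history_len, obs_dim) buffer and map to a wrench.
        return self.w @ proprio_history.ravel()

class LocomotionBackbone:
    """Low-level policy: outputs joint targets given the current
    proprioceptive state and the estimated wrench."""
    def __init__(self, obs_dim, wrench_dim=6, num_joints=12, seed=1):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((num_joints, obs_dim + wrench_dim)) * 0.01

    def __call__(self, proprio, wrench):
        return self.w @ np.concatenate([proprio, wrench])

obs_dim, history_len = 48, 10
estimator = IntentEstimator(obs_dim, history_len)
backbone = LocomotionBackbone(obs_dim)

# One control step: update the proprioceptive buffer, estimate the
# partner wrench, then compute joint targets.
history = np.zeros((history_len, obs_dim))
proprio = np.ones(obs_dim)  # placeholder proprioceptive reading
history = np.roll(history, -1, axis=0)
history[-1] = proprio

wrench = estimator(history)                # high level: intent as a wrench
joint_targets = backbone(proprio, wrench)  # low level: robust locomotion
print(wrench.shape, joint_targets.shape)   # (6,) (12,)
```

The key design point the sketch mirrors is that only the wrench estimate crosses the interface between the two levels, which is why the paper can swap the locomotion backbone to transfer across embodiments.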