OmniRobotHome: A Multi-Camera Platform for Real-Time Multiadic Human-Robot Interaction
Junyoung Lee, Sookwan Han, Jeonghwan Kim, Inhee Lee, Mingi Choi, et al.
TLDR
OmniRobotHome is a new multi-camera platform enabling real-time, occlusion-robust 3D tracking for multiadic human-robot interaction in home environments.
Key contributions
- First room-scale platform for multiadic human-robot interaction in natural home environments.
- Utilizes 48 synchronized RGB cameras for markerless, occlusion-robust 3D tracking of humans and objects.
- Enables coordinated multi-robot actuation with two Franka arms based on real-time scene state.
- Supports long-horizon human behavior modeling via accumulated interaction trajectories.
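The contributions hinge on camera frames and robot state sharing one timeline: the arms act on "live scene state," so each perception frame must be paired with the robot sample closest to it in time. A minimal sketch of such nearest-timestamp alignment, assuming hardware-synchronized clocks; the function name, tolerance, and data layout are illustrative, not the paper's actual interface:

```python
import bisect

def align_to_frames(frame_ts, state_log, max_skew=0.01):
    """Pair each camera frame with the nearest robot-state sample.

    frame_ts:  sorted list of frame timestamps (seconds).
    state_log: sorted list of (timestamp, state) tuples.
    max_skew:  reject matches farther apart than this (seconds).
    Returns a list of (frame_t, state or None).
    """
    state_ts = [t for t, _ in state_log]
    out = []
    for ft in frame_ts:
        i = bisect.bisect_left(state_ts, ft)
        # The nearest sample is either just before or just after ft.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(state_ts):
                d = abs(state_ts[j] - ft)
                if best is None or d < best[0]:
                    best = (d, state_log[j][1])
        # Drop matches outside the allowed skew rather than guess.
        out.append((ft, best[1] if best and best[0] <= max_skew else None))
    return out
```

With 48 cameras running at a fixed frame rate, a per-frame nearest match with a tight skew bound is a common way to keep actuation decisions consistent with the most recent perceived scene.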
Why it matters
This platform addresses the central bottleneck of real-time, occlusion-robust 3D tracking in multiadic human-robot interaction. It makes the previously underexplored regime of multiple humans and robots sharing a workspace experimentally tractable, enabling studies of safety and human-anticipatory assistance in shared environments.
Original Abstract
Human-robot collaboration has been studied primarily in dyadic or sequential settings. However, real homes require multiadic collaboration, where multiple humans and robots share a workspace, acting concurrently on interleaved subtasks with tight spatial and temporal coupling. This regime remains underexplored because close-proximity interaction between humans, robots, and objects creates persistent occlusion and rapid state changes, making reliable real-time 3D tracking the central bottleneck. No existing platform provides the real-time, occlusion-robust, room-scale perception needed to make this regime experimentally tractable. We present OmniRobotHome, the first room-scale residential platform that unifies wide-area real-time 3D human and object perception with coordinated multi-robot actuation in a shared world frame. The system instruments a natural home environment with 48 hardware-synchronized RGB cameras for markerless, occlusion-robust tracking of multiple humans and objects, temporally aligned with two Franka arms that act on live scene state. Continuous capture within this consistent frame further supports long-horizon human behavior modeling from accumulated trajectories. The platform makes the multiadic collaboration regime experimentally tractable. We focus on two central problems: safety in shared human-robot environments and human-anticipatory robotic assistance, and show that real-time perception and accumulated behavior memory each yield measurable gains in both.
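The abstract's "shared world frame" rests on a standard building block: fusing synchronized 2D detections from calibrated cameras into a 3D point by multi-view triangulation. A minimal linear (DLT) sketch, assuming known per-camera projection matrices; this is an illustrative textbook method, not the paper's actual pipeline:

```python
import numpy as np

def triangulate(proj_mats, pixels):
    """Linear (DLT) triangulation of one 3D point from >= 2 views.

    proj_mats: list of 3x4 projection matrices (K @ [R|t]), all
               expressed in the same shared world frame.
    pixels:    list of (u, v) observations of the same point.
    Returns the 3D point in world coordinates.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear constraints on the
        # homogeneous point X: u*(P3.X) = P1.X and v*(P3.X) = P2.X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With many overlapping views, a point occluded in some cameras can still be triangulated from the remaining ones, which is the intuition behind occlusion-robust tracking at room scale.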