ArXiv TLDR

Simulating Infant First-Person Sensorimotor Experience via Motion Retargeting from Babies to Humanoids

arXiv:2604.27583

Francisco M. López, Hoshinori Kanazawa, Ondrej Fiala, Yakov Balashov, Valentin Marcel + 6 more

q-bio.NC, cs.RO

TLDR

This paper introduces a framework that simulates infants' first-person sensorimotor experience by retargeting their motion, reconstructed from video, onto physical and virtual humanoids.

Key contributions

  • Reconstructs infant 3D pose and skeletal structure from a single video.
  • Retargets infant motion onto physical (iCub) and virtual humanoid platforms.
  • Simulates multimodal sensorimotor streams: proprioception, touch, and vision.
  • Achieves sub-centimeter retargeting accuracy on the best-matching embodiment (see the error-metric sketch below), enabling rich multimodal analysis of infant development.
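
The accuracy figure above invites a concrete reading. This summary does not state which metric the authors use, so the snippet below assumes a common choice: the mean Euclidean distance between corresponding infant and robot keypoints, with "sub-centimeter" meaning a mean error below 0.01 m. It is an illustrative sketch, not the paper's evaluation code.

```python
# Hedged sketch: an assumed retargeting-error metric (mean per-keypoint
# Euclidean distance). This is NOT taken from the paper or its repository.
import numpy as np

def mean_keypoint_error(infant_kpts: np.ndarray, robot_kpts: np.ndarray) -> float:
    """Mean per-keypoint Euclidean error in meters.

    Both arrays have shape (T, K, 3): T frames, K keypoints, xyz in meters.
    """
    return float(np.linalg.norm(infant_kpts - robot_kpts, axis=-1).mean())

# Example: errors under 0.01 m correspond to the sub-centimeter regime.
infant = np.random.rand(100, 17, 3)
robot = infant + np.random.normal(scale=0.003, size=infant.shape)  # ~3 mm noise
print(f"mean error: {mean_keypoint_error(infant, robot) * 100:.2f} cm")
```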

Why it matters

This framework offers a unique window into infant sensorimotor experience, providing new tools for robotics and developmental science, and may support early detection of neurodevelopmental disorders.

Original Abstract

Motion retargeting from humans to human-like artificial agents is becoming increasingly important as humanoid robots grow more capable. However, most existing approaches focus only on reproducing kinematics and ignore the rich sensorimotor experience associated with human movement. In this work, we present a framework for simulating the multimodal sensorimotor experiences of infants using physical and virtual humanoids. From a single video, our method reconstructs the infant's body configuration by extracting its skeletal structure and estimating the full 3D pose from each frame. Then we map the reconstructed motion onto several developmental platforms: the physical iCub robot and the virtual simulators pyCub, EMFANT and MIMo. Replaying the retargeted motions on these embodiments produces simulated multisensory streams including proprioception (joints and muscles), touch, and vision. For the best-matching embodiment, the retargeting achieves sub-centimeter accuracy and enables a rich multimodal analysis of infant development as well as enhanced automated annotation of behaviors. This framework provides a unique window into the infant's sensorimotor experience, offering new tools for robotics, developmental science, and early detection of neurodevelopmental disorders. The code is available at https://github.com/ctu-vras/motion-retargeting/.
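
To make the pipeline in the abstract concrete (video → skeletal structure and per-frame 3D pose → joint-space retargeting → replay that yields proprioception, touch, and vision), here is a minimal Python sketch. Every name in it is a hypothetical placeholder rather than the API of the ctu-vras/motion-retargeting repository, and the pose estimator and humanoid are replaced by toy stand-ins so the data flow runs end to end.

```python
"""Toy sketch of the video -> pose -> retargeting -> sensorimotor-stream
pipeline summarized above. All names are hypothetical placeholders, not the
actual API of https://github.com/ctu-vras/motion-retargeting/."""

from dataclasses import dataclass, field
from typing import Dict, List, Optional

import numpy as np


@dataclass
class SensorimotorFrame:
    """One time step of simulated multimodal infant experience."""
    joint_angles: np.ndarray                               # proprioception
    touch: Dict[str, float] = field(default_factory=dict)  # per-patch contact
    vision: Optional[np.ndarray] = None                    # egocentric RGB image


def estimate_3d_pose(frame: np.ndarray, n_joints: int = 20) -> np.ndarray:
    """Stand-in for a monocular infant 3D pose estimator (returns joint angles)."""
    return np.random.uniform(-1.0, 1.0, size=n_joints)


class ToyHumanoid:
    """Stand-in for an embodiment such as iCub, pyCub, or MIMo."""

    def step(self, joint_targets: np.ndarray) -> SensorimotorFrame:
        # A real embodiment would run physics and render a first-person view;
        # here we echo the commanded pose and fake the other modalities.
        touch = {"left_hand": float(np.random.rand()), "torso": float(np.random.rand())}
        vision = np.zeros((64, 64, 3), dtype=np.uint8)
        return SensorimotorFrame(joint_angles=joint_targets, touch=touch, vision=vision)


def retarget_video(video_frames: List[np.ndarray], robot: ToyHumanoid) -> List[SensorimotorFrame]:
    """Replay per-frame infant poses on the embodiment and collect the stream."""
    return [robot.step(estimate_3d_pose(f)) for f in video_frames]


if __name__ == "__main__":
    fake_video = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(5)]
    stream = retarget_video(fake_video, ToyHumanoid())
    print(len(stream), "frames;", stream[0].joint_angles.shape, "joint angles each")
```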
