ArXiv TLDR

GaussianFlow SLAM: Monocular Gaussian Splatting SLAM Guided by GaussianFlow

arXiv:2604.15612

Dong-Uk Seo, Jinwoo Jeon, Eungchang Mason Lee, Hyun Myung

cs.RO · cs.CV

TLDR

GaussianFlow SLAM improves monocular Gaussian Splatting SLAM by using optical flow to guide scene structure and camera pose optimization.

Key contributions

  • Leverages optical flow (GaussianFlow) for geometric guidance in monocular 3DGS-SLAM.
  • Aligns projected Gaussian motion with optical flow to regularize map reconstruction and pose.
  • Introduces normalized error-based densification and pruning for refining inactive/unstable Gaussians.
  • Achieves superior rendering quality and tracking accuracy compared to state-of-the-art methods.

Why it matters

Monocular Gaussian Splatting SLAM lacks reliable geometric cues, which can lead to structural degeneracies and pose inaccuracies. This paper integrates optical flow as robust geometric supervision, significantly improving both map quality and tracking, and pushing the capabilities of monocular SLAM.
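The core idea, aligning the projected image-space motion of Gaussians with a precomputed optical flow field, can be sketched as a simple loss. This is a hypothetical illustration only (function names, nearest-neighbour sampling, and the L1 penalty are assumptions), not the paper's actual implementation:

```python
import numpy as np

def gaussianflow_loss(proj_t, proj_t1, flow_map):
    """L1 discrepancy between projected Gaussian motion and optical flow.

    proj_t, proj_t1 : (N, 2) projected Gaussian centers (x, y) at frames t and t+1.
    flow_map        : (H, W, 2) optical flow from frame t to frame t+1.
    """
    # "GaussianFlow": motion of each Gaussian in image space between frames.
    gaussian_flow = proj_t1 - proj_t

    # Sample the optical flow at each Gaussian's frame-t pixel
    # (nearest-neighbour lookup for simplicity; a real system would interpolate).
    xy = np.round(proj_t).astype(int)
    h, w = flow_map.shape[:2]
    xy[:, 0] = np.clip(xy[:, 0], 0, w - 1)
    xy[:, 1] = np.clip(xy[:, 1], 0, h - 1)
    sampled_flow = flow_map[xy[:, 1], xy[:, 0]]  # (N, 2)

    # Penalize disagreement; in the paper this term regularizes both
    # map reconstruction and pose estimation during optimization.
    return np.abs(gaussian_flow - sampled_flow).mean()
```

When the Gaussians' projected motion exactly matches the flow field, the loss is zero; any structural or pose error that moves Gaussians inconsistently with the observed flow is penalized.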

Original Abstract

Gaussian splatting has recently gained traction as a compelling map representation for SLAM systems, enabling dense and photo-realistic scene modeling. However, its application to monocular SLAM remains challenging due to the lack of reliable geometric cues from monocular input. Without geometric supervision, mapping or tracking could fall in local-minima, resulting in structural degeneracies and inaccuracies. To address this challenge, we propose GaussianFlow SLAM, a monocular 3DGS-SLAM that leverages optical flow as a geometry-aware cue to guide the optimization of both the scene structure and camera poses. By encouraging the projected motion of Gaussians, termed GaussianFlow, to align with the optical flow, our method introduces consistent structural cues to regularize both map reconstruction and pose estimation. Furthermore, we introduce normalized error-based densification and pruning modules to refine inactive and unstable Gaussians, thereby contributing to improved map quality and pose accuracy. Experiments conducted on public datasets demonstrate that our method achieves superior rendering quality and tracking accuracy compared with state-of-the-art algorithms. The source code is available at: https://github.com/url-kaist/gaussianflow-slam.
