ArXiv TLDR

Touching Space: Accessible Map Exploration Through Conversational Audio-Haptic Interaction

arXiv:2604.14637

Li Liu, Jiaming Qu, Marc Jowell Bagaoisan, David T. Lee, Leilani H. Gilpin

cs.HC

TLDR

Touching Space is an end-to-end system that helps Blind and Low-Vision (BLV) users build cognitive maps for pre-travel planning through conversational audio-haptic map exploration on commodity hardware.

Key contributions

  • Enables Blind and Low-Vision (BLV) users to build cognitive maps for pre-travel planning.
  • Combines haptic and audio feedback so users can explore spatial layouts through touch.
  • Integrates a conversational agent for spoken questions during map exploration.
  • Supports cognitive map construction on commodity hardware for BLV individuals.

Why it matters

This paper addresses a critical gap in assistive navigation by enabling Blind and Low-Vision individuals to build cognitive maps before travel. It enhances independence and planning for unfamiliar environments, moving beyond real-time guidance to proactive spatial understanding.

Original Abstract

Most existing assistive navigation tools focus on providing real-time guidance for Blind and Low-Vision (BLV) people, but few support building a holistic spatial understanding of unfamiliar environments before travel. Such cognitive map construction (e.g., knowing that a fountain is south of a tower and west of a hotel) is important for pre-travel planning, yet remains underexplored in prior work. To address this gap, we present Touching Space, an end-to-end system that retrieves map data for a target place and loads it into a frontend interface for exploration. The system combines haptic and audio feedback: users explore spatial layouts through touch and ask spoken questions to a conversational agent during exploration. Touching Space contributes a conversational interface that supports BLV users in building cognitive maps on commodity hardware.
