Show Me the Infographic I Imagine: Intent-Aware Infographic Retrieval for Authoring Support
Jing Xu, Jiarui Hu, Zhihao Shuai, Yiyun Chen, Weikai Yang
TLDR
This paper introduces an intent-aware framework for retrieving infographics, helping users create them more easily by aligning queries with visual designs.
Key contributions
- Develops an intent-aware infographic retrieval framework.
- Conducts a formative study to derive an infographic intent taxonomy.
- Leverages the taxonomy to enrich user queries, guiding retrieval with intent-specific cues.
- Supports high-level edit intents with an interactive agent for design adaptation.
Why it matters
Authoring infographics is challenging, especially for novices. By surfacing relevant exemplars as design inspiration, this framework lowers that barrier: it improves retrieval quality and supports efficient authoring through a better understanding of user intent.
Original Abstract
While infographics have become a powerful medium for communicating data-driven stories, authoring them from scratch remains challenging, especially for novice users. Retrieving relevant exemplars from a large corpus can provide design inspiration and promote reuse, substantially lowering the barrier to infographic authoring. However, effective retrieval is difficult because users often express design intent in ambiguous natural language, while infographics embody rich and multi-faceted visual designs. As a result, keyword-based search often fails to capture design intent, and general-purpose vision-language retrieval models trained on natural images are ill-suited to the text-heavy, multi-component nature of infographics. To address these challenges, we develop an intent-aware infographic retrieval framework that better aligns user queries with infographic designs. We first conduct a formative study of how people describe infographics and derive an intent taxonomy spanning content and visual design facets. This taxonomy is then leveraged to enrich and refine free-form user queries, guiding the retrieval process with intent-specific cues. Building on the retrieved exemplars, users can adapt the designs to their own data with high-level edit intents, supported by an interactive agent that performs low-level adaptation. Both quantitative evaluations and user studies are conducted to demonstrate that our method improves retrieval quality over baseline methods while better supporting intent satisfaction and efficient infographic authoring.
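The pipeline the abstract describes — enriching a free-form query with intent-specific cues from a taxonomy, then retrieving matching exemplars — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the facet names in `FACET_KEYWORDS` and the keyword-matching enrichment are hypothetical stand-ins for the paper's formative-study taxonomy, and the bag-of-words cosine similarity stands in for whatever embedding-based matching the actual system uses.

```python
from collections import Counter
from math import sqrt

# Hypothetical facet keywords; the paper derives a richer taxonomy
# spanning content and visual-design facets from a formative study.
FACET_KEYWORDS = {
    "layout": ["timeline", "grid", "circular"],
    "color": ["pastel", "dark", "blue"],
    "chart type": ["bar chart", "pie chart", "line chart"],
}

def enrich_query(query: str) -> str:
    """Append intent-specific cues detected in a free-form query."""
    cues = []
    for facet, keywords in FACET_KEYWORDS.items():
        for kw in keywords:
            if kw in query.lower():
                cues.append(f"{facet}: {kw}")
    return query if not cues else query + " | " + "; ".join(cues)

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    """Rank corpus entries (id -> textual description) against the
    enriched query and return the top-k ids."""
    q = Counter(enrich_query(query).lower().split())
    ranked = sorted(
        corpus,
        key=lambda doc_id: cosine(q, Counter(corpus[doc_id].lower().split())),
        reverse=True,
    )
    return ranked[:k]
```

For example, the query "a timeline infographic in pastel colors" picks up the cues `layout: timeline` and `color: pastel`, which then bias retrieval toward exemplars whose descriptions mention those facets.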