ArXiv TLDR

A Functorial Formulation of Neighborhood Aggregating Deep Learning

arXiv:2604.24672

Sun Woo Park, Yun Young Choi, U Jin Choi, Youngho Woo

cs.LG, math.AT

TLDR

This paper offers a functorial, presheaf-based mathematical framework for CNNs/MPNNs, explaining their empirical limitations.

Key contributions

  • Interprets CNNs/MPNNs mathematically using presheaves and copresheaves over topological spaces.
  • Develops a theoretical heuristic to explain empirical limitations of these deep learning models.
  • Identifies these limitations by analyzing obstructions to such sets of continuous functions being sheaves or copresheaves.
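The neighborhood-aggregation step that the paper models with (co)presheaves can be illustrated with a minimal sketch. This is not the paper's construction — the function name, mean aggregation, and dict-based graph representation are all illustrative assumptions — but it shows the neighborhood-local operation being formalized:

```python
# Hypothetical minimal sketch (not the paper's construction): one round of
# message passing aggregates features over each node's neighborhood, mirroring
# how a presheaf assigns data to neighborhoods of a space.

def mpnn_layer(features, adjacency):
    """One round of mean-aggregation message passing.

    features:  dict mapping node -> feature value (float)
    adjacency: dict mapping node -> list of neighbor nodes
    """
    updated = {}
    for node, neighbors in adjacency.items():
        # "Restrict" attention to the closed neighborhood {node} ∪ neighbors,
        # then aggregate -- the neighborhood-local operation the paper
        # interprets via (co)presheaves of continuous functions.
        neighborhood = [node] + list(neighbors)
        updated[node] = sum(features[n] for n in neighborhood) / len(neighborhood)
    return updated
```

A CNN layer is the special case where the graph is a pixel grid and the aggregation is a learned weighted sum over a fixed-shape neighborhood.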

Why it matters

This paper provides a rigorous mathematical foundation for understanding the behavior and shortcomings of widely used deep learning architectures such as CNNs and MPNNs. By framing neighborhood aggregation in the language of presheaves and sheaf-theoretic obstructions, it offers a new lens for analyzing, and potentially overcoming, their empirical limitations, which can guide future architectural design.

Original Abstract

We provide a mathematical interpretation of convolutional (or message passing) neural networks by using presheaves and copresheaves of the set of continuous functions over a topological space. Based on this interpretation, we formulate a theoretical heuristic which elaborates a number of empirical limitations of these neural networks by using obstructions on such sets of continuous functions over a topological space to be sheaves or copresheaves.
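For background on the obstruction the abstract refers to: a presheaf fails to be a sheaf when its gluing axiom breaks down. The standard statement (a textbook definition, not taken from the paper) is that a presheaf $\mathcal{F}$ on a space $X$ is a sheaf if, for every open set $U$ and open cover $\{U_i\}$ of $U$, compatible local sections glue uniquely:

```latex
% Sheaf gluing axiom (standard definition, for background):
s_i\big|_{U_i \cap U_j} = s_j\big|_{U_i \cap U_j} \;\; \forall\, i, j
\;\Longrightarrow\;
\exists!\, s \in \mathcal{F}(U) \ \text{such that} \ s\big|_{U_i} = s_i \ \forall\, i.
```

The paper's heuristic, as summarized above, ties empirical limitations of CNNs/MPNNs to failures of conditions of this kind for sets of continuous functions.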

📬 Weekly AI Paper Digest

Get the top 10 AI/ML arXiv papers from the week — summarized, scored, and delivered to your inbox every Monday.