ArXiv TLDR

Gromov-Wasserstein Methods for Multi-View Relational Embedding and Clustering

arXiv: 2604.23912

Rafael Pereira Eufrazio, Eduardo Fernandes Montesuma, Charles Casimiro Cavalcante

cs.LG, stat.ML

TLDR

This paper introduces two Gromov-Wasserstein methods, Bary-GWMDS and Mean-GWMDS-C, for multi-view relational embedding and clustering when the underlying geometry differs across views.

Key contributions

  • Proposes Bary-GWMDS, a Gromov-Wasserstein method for consensus embedding from multi-view distance matrices.
  • Naturally handles nonlinear geometric distortions across views by leveraging intrinsic distances.
  • Introduces Mean-GWMDS-C, a clustering-oriented variant that averages distance matrices and learns reduced-support representations via a consensus Gromov-Wasserstein transport.
  • Yields stable and geometrically meaningful embeddings on synthetic and real-world datasets.

Why it matters

Learning from multi-view data is hard when each view lives in a different geometry. By operating directly on distance matrices with Gromov-Wasserstein transport, the proposed framework learns shared relational structure and supports clustering without assuming the views are linearly related, advancing multi-view data analysis.

Original Abstract

Learning low-dimensional representations from multi-view relational data is challenging when underlying geometries differ across views. We propose Bary-GWMDS, a Gromov-Wasserstein-based method that operates directly on distance matrices to learn a consensus embedding preserving shared relational structure. By leveraging intrinsic distances, the approach naturally handles nonlinear distortions across views. We also introduce Mean-GWMDS-C, a clustering-oriented formulation that averages distance matrices and learns reduced-support representations via a consensus Gromov-Wasserstein transport. Experiments on synthetic and real-world datasets show that the proposed framework yields stable and geometrically meaningful embeddings.
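To make the pipeline concrete, here is a minimal sketch of the *idea* behind the clustering-oriented formulation, under strong simplifying assumptions: samples are already aligned across views, so the Gromov-Wasserstein transport reduces to plain averaging of per-view distance matrices, followed by classical MDS to obtain an embedding. This is not the authors' algorithm; the helper names (`dist_matrix`, `classical_mds`) and the synthetic data are mine.

```python
import numpy as np

def dist_matrix(Y):
    """Pairwise Euclidean distance matrix of the rows of Y."""
    diff = Y[:, None, :] - Y[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def classical_mds(D, k=2):
    """Embed an n x n distance matrix D into k dimensions via classical MDS."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                     # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]                # top-k eigenpairs
    scale = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * scale                     # n x k embedding

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                     # shared latent points
view1 = X                                        # view 1: undistorted
view2 = np.tanh(X) + 0.01 * rng.normal(size=X.shape)  # view 2: nonlinear distortion

# Averaging step (stand-in for the consensus GW transport, valid only
# because the rows of view1 and view2 correspond to the same samples):
D_mean = 0.5 * (dist_matrix(view1) + dist_matrix(view2))
Z = classical_mds(D_mean, k=2)
print(Z.shape)  # (50, 2)
```

The point of the paper is precisely that this alignment assumption is unrealistic: when correspondences and geometries differ across views, the averaging must be mediated by Gromov-Wasserstein couplings between distance matrices rather than done entrywise.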
