ArXiv TLDR

A Pragmatic Comparison of Cryptographic Computation Technologies for Machine Learning

arXiv:2605.04858

Marcus Taubert, Adam Skuta, Thomas Loruenser

cs.CR

TLDR

This paper compares secure multi-party computation (SMPC) and fully homomorphic encryption (FHE) for machine learning, benchmarking their performance across ML operations and models.

Key contributions

  • Compares SMPC and FHE paradigms, outlining advantages, limitations, and open-source implementations.
  • Benchmarks leading software frameworks for machine learning operations and models.
  • FHE outperforms SMPC for regressions, and may be faster for simple dense networks when using GPUs or hybrid models.
  • SMPC demonstrates superior performance for complex models such as CNNs.
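To make the SMPC paradigm concrete, the sketch below illustrates its most basic building block, additive secret sharing: a value is split into random shares so that no single party learns it, yet parties can add their shares locally to compute a sum of secrets. This is a minimal illustration of the idea only, not code from the paper or from any of the benchmarked frameworks; the modulus choice and share count are arbitrary.

```python
import secrets

# Illustrative field modulus (a Mersenne prime); real protocols pick this
# to suit the circuit and security parameters.
P = 2**61 - 1

def share(x, n=3):
    """Split x into n additive shares that sum to x mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the secret."""
    return sum(shares) % P

a, b = 20, 22
sa, sb = share(a), share(b)
# Each party adds its own pair of shares locally; no party ever sees a or b.
sc = [(x + y) % P for x, y in zip(sa, sb)]
print(reconstruct(sc))  # 42
```

Addition is essentially free in this setting; the cost of SMPC comes from multiplications and non-linear operations, which require interaction between parties — one reason the performance trade-offs against FHE depend so strongly on the model architecture.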

Why it matters

This paper addresses the challenge of selecting the right secure computation technology for machine learning. It provides practical guidance by comparing SMPC and FHE performance across various ML tasks, helping practitioners make informed adoption decisions.

Original Abstract

As security demands increase, the importance of secure computation technologies grows, yet these technologies can often seem overwhelming to practitioners. Furthermore, many approaches focus only on a single technology, potentially overlooking superior alternatives. This work aims to address the issue of selecting the right technology for secure computation by presenting a comparative analysis of two highly relevant cryptographic methods and their software implementations, with a particular focus on machine learning. Firstly, we provide a theoretical summary and comparison of the secure computation paradigms of secure multi-party computation (SMPC) and fully homomorphic encryption (FHE). We outline the advantages and limitations of the protocols, as well as the relevant open-source software implementations. Secondly, we present the results of extensive benchmarking of the main software frameworks identified for machine learning operations and models. Regarding the current state of the art in FHE, we observe that it outperforms SMPC for regressions. Additionally, it may be faster for simple dense networks using GPUs or hybrid models. Conversely, SMPC showed superior performance for complex models such as CNNs. Our results should pave the way for more technology-agnostic benchmarking of secure computation technologies for machine learning, providing guidance for practitioners looking to adopt these technologies.
