ArXiv TLDR

Physical Foundation Models: Fixed hardware implementations of large-scale neural networks

arXiv: 2604.27911

Logan G. Wright, Tianyu Wang, Tatsuhiro Onodera, Peter L. McMahon

cs.LG, cs.ET, cs.NE

TLDR

Physical Foundation Models (PFMs) are proposed fixed-hardware implementations of large neural networks that compute through their natural physical dynamics, promising orders-of-magnitude gains in efficiency and scale.

Key contributions

  • Introduces Physical Foundation Models (PFMs): fixed hardware in which the neural network is realized directly in the physical design and operates via the hardware's natural physical dynamics.
  • Argues that PFMs could deliver orders-of-magnitude improvements in energy efficiency, speed, and parameter density for AI inference.
  • Projects that PFMs could bring AI to power-constrained edge devices and make 10^15- or even 10^18-parameter models plausible, far beyond today's ~10^12-parameter foundation models.
  • Presents back-of-the-envelope scaling calculations for an optical PFM based on a 3D nanostructured glass medium, and discusses prospects in nanoelectronics and other physical platforms.

Why it matters

Foundation models are energy-intensive, and this paper proposes Physical Foundation Models (PFMs) as a radical hardware response. PFMs could drastically cut AI's energy footprint in datacenters, bring advanced AI to power-constrained edge devices, and support models orders of magnitude larger than today's.

Original Abstract

Foundation models are deep neural networks (such as GPT-5, Gemini 3, and Opus 4) trained on large datasets that can perform diverse downstream tasks -- text and code generation, question answering, summarization, image classification, and so on. The philosophy of foundation models is to put effort into a single, large (~10^12-parameter) general-purpose model that can be adapted to many downstream tasks with no or minimal additional training. We argue that the rise of foundation models presents an opportunity for hardware engineers: in contrast to when different models were used for different tasks, it now makes sense to build special-purpose, fixed hardware implementations of neural networks, manufactured and released at the roughly 1-year cadence of major new foundation-model versions. Beyond conventional digital-electronic inference hardware with read-only weight memory, we advocate a more radical re-thinking: hardware in which the neural network is realized directly at the level of the physical design and operates via the hardware's natural physical dynamics -- Physical Foundation Models (PFMs). PFMs could enable orders-of-magnitude advantages in energy efficiency, speed, and parameter density. For ~10^12-parameter models, this would both reduce the high energy burden of AI in datacenters and enable AI in edge devices that today are power-constrained to far smaller models. PFMs could also enable inference hardware for models much larger than current ones: 10^15- or even 10^18-parameter PFMs seem plausible by some measures. We present back-of-the-envelope calculations illustrating PFM scaling using an optical example -- a 3D nanostructured glass medium -- and discuss prospects in nanoelectronics and other physical platforms. We conclude with the major research challenges that must be resolved for trillion-parameter PFMs and beyond to become reality.
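The abstract's claim that 10^15- or even 10^18-parameter PFMs are plausible rests on a volumetric parameter-density argument. The sketch below is a minimal illustration of that kind of back-of-the-envelope estimate, not the paper's actual calculation: it assumes each nanostructured, roughly wavelength-scale voxel of a 3D glass block encodes about one fixed weight, and the block sizes and feature sizes used (1 cm with 1 µm features, 10 cm with 100 nm features) are illustrative assumptions.

```python
# Illustrative back-of-the-envelope estimate of parameter density for an
# optical PFM. Assumption (not from the paper): each nanostructured voxel
# of a 3D glass medium acts as roughly one fixed weight.

def pfm_parameter_estimate(side_cm: float, feature_nm: float) -> float:
    """Estimate how many weight-like voxels fit in a cube of glass.

    side_cm    -- edge length of the glass block in centimetres (assumed)
    feature_nm -- linear size of one nanostructured feature in nanometres (assumed)
    """
    side_nm = side_cm * 1e7           # 1 cm = 1e7 nm
    voxels_per_edge = side_nm / feature_nm
    return voxels_per_edge ** 3       # independent voxels ~ parameters

# A 1 cm cube with ~1 µm features already holds ~1e12 voxels,
# comparable to today's ~10^12-parameter foundation models.
print(f"{pfm_parameter_estimate(1.0, 1000):.1e}")   # ~1.0e+12

# A 10 cm cube with ~100 nm features reaches ~1e18 voxels,
# the upper end of the scaling the abstract calls plausible "by some measures".
print(f"{pfm_parameter_estimate(10.0, 100):.1e}")   # ~1.0e+18
```

The point of the sketch is only that volumetric weight storage scales with the cube of the linear dimension, which is why modest increases in device size and decreases in feature size translate into many orders of magnitude more parameters than planar electronic memories provide.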
