SCNO: Spiking Compositional Neural Operator -- Towards a Neuromorphic Foundation Model for Nuclear PDE Solving
Samrendra Roy, Souvik Chakraborty, Rizwan-uddin, Syed Bahauddin Alam
TLDR
SCNO is a modular spiking neural operator that composes elementary blocks to solve complex PDEs with high accuracy and efficiency, enabling neuromorphic foundation models.
Key contributions
- Introduces SCNO, a modular spiking neural operator for solving complex partial differential equations (PDEs).
- Composes small, pre-trained blocks for elementary operators to solve unseen coupled PDEs.
- Adds a small correction network that learns cross-coupling residuals while all blocks and the aggregator stay frozen, giving zero-forgetting modular expansion.
- Achieves lower relative $L^2$ error than a monolithic spiking DeepONet (by up to 62%) and a standard ANN DeepONet (by up to 65%), with far fewer trainable parameters (95K vs. 462K for the monolithic baseline).
Why it matters
This paper introduces a novel modular and energy-efficient approach to solving PDEs using spiking neural networks. It addresses key limitations of current neural operators by enabling compositional problem-solving and zero-forgetting expansion. This paves the way for neuromorphic foundation models that can adapt to new physics without costly retraining.
Original Abstract
Neural operators have emerged as powerful surrogates for partial differential equation (PDE) solvers, yet they are typically trained as monolithic models for individual PDEs, require energy-intensive GPU hardware, and must be retrained from scratch when new physics emerge. We introduce the Spiking Compositional Neural Operator (SCNO), a modular architecture combining spiking and conventional components that addresses all three limitations. SCNO maintains a library of small spiking neural operator blocks, each trained on a single elementary differential operator (convection, diffusion, reaction), and composes them through a lightweight input-conditioned aggregator to solve coupled PDEs not seen during block training. A small correction network learns cross-coupling residuals while keeping all blocks and the aggregator frozen, preserving zero-forgetting modular expansion by construction. We evaluate SCNO on eight PDE families including five coupled systems and a nuclear-relevant 1-group neutron diffusion equation. SCNO with correction achieves the lowest relative $L^2$ error on four of five coupled PDEs, outperforming both a monolithic spiking DeepONet (by up to 62%, mean over 3 seeds) and a standard ANN DeepONet (by up to 65%), while requiring only 95K trainable parameters versus 462K for the monolithic baseline. To our knowledge, this is the first compositional spiking neural operator and the first proof-of-concept for modular neuromorphic PDE solving with built-in forgetting-free expansion.
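The compositional pattern the abstract describes — frozen pre-trained blocks for elementary operators, a lightweight input-conditioned aggregator, and a correction network that is the only trainable part when new coupled physics appears — can be sketched as follows. This is an illustrative toy, not the authors' implementation: the linear stand-in blocks, the softmax aggregator, and all names and shapes are assumptions; in SCNO each block would be a small spiking neural operator trained on one elementary differential operator (convection, diffusion, reaction).

```python
import numpy as np

rng = np.random.default_rng(0)
D = 32  # input field sampled at D points (assumed discretization)

# Frozen elementary blocks: fixed maps from input field to output field.
# Stand-ins here are random linear operators; in SCNO each is a
# pre-trained spiking neural operator block kept frozen after training.
blocks = [rng.standard_normal((D, D)) * 0.1 for _ in range(3)]

# Input-conditioned aggregator: maps the input field to one mixing
# weight per block (softmax so weights are positive and sum to 1).
W_agg = rng.standard_normal((3, D)) * 0.1

def aggregate_weights(u):
    logits = W_agg @ u
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Correction network: learns cross-coupling residuals for a new coupled
# PDE. Only this part would be trained; blocks and aggregator stay
# frozen, so previously learned operators are never overwritten
# (zero-forgetting expansion by construction).
W_corr = np.zeros((D, D))

def scno_forward(u):
    outs = np.stack([B @ u for B in blocks])   # frozen block outputs
    w = aggregate_weights(u)                   # input-conditioned mixing
    composed = np.tensordot(w, outs, axes=1)   # weighted composition
    residual = W_corr @ u                      # cross-coupling residual
    return composed + residual

u = rng.standard_normal(D)
y = scno_forward(u)
print(y.shape)  # (32,)
```

Adding a new elementary operator means appending a block to the library and widening the aggregator; nothing already trained is touched, which is the mechanism behind the paper's forgetting-free expansion claim.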