ArXiv TLDR

Neuromorphic Parameter Estimation for Power Converter Health Monitoring Using Spiking Neural Networks

arXiv: 2604.15714

Hyeongmeen Baik, Hamed Poursiami, Maryam Parsa, Jinia Roy

cs.NE cs.LG eess.SY

TLDR

This paper introduces a neuromorphic spiking neural network (SNN) for ultra-low-power, always-on health monitoring of power converters, achieving parameter-estimation accuracy within manufacturing tolerance at a projected ~270x energy reduction.

Key contributions

  • A three-layer SNN estimates passive component parameters for power converter health monitoring.
  • A differentiable ODE solver enforces physics-consistent training, decoupling the ODE physics loss from the unrolled spiking loop.
  • Cuts lumped resistance error from 25.8% to 10.2% (within the ±10% manufacturing tolerance of passive components), with a projected ~270x energy reduction on neuromorphic hardware.
  • Persistent membrane states enable degradation tracking and event-driven fault detection via a +5.5 percentage-point spike-rate jump at abrupt faults.
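To make the spiking side of the architecture concrete, here is a minimal discrete-time leaky integrate-and-fire (LIF) sketch with a two-layer forward pass and an output spike-rate readout. This is an illustration of the general mechanism, not the authors' implementation: the leak factor, threshold, layer sizes, and random input (standing in for encoded converter waveforms) are all assumed values.

```python
import numpy as np

def lif_step(v, x, w, beta=0.9, threshold=1.0):
    """One discrete-time LIF update: leak, integrate weighted input, spike, soft reset.
    beta and threshold are illustrative hyperparameters."""
    v = beta * v + x @ w                      # leaky integration
    spikes = (v >= threshold).astype(float)   # fire where membrane crosses threshold
    v = v - spikes * threshold                # soft reset by threshold subtraction
    return v, spikes

rng = np.random.default_rng(0)
w1 = rng.normal(0.0, 0.5, (8, 16))   # input -> hidden weights (sizes illustrative)
w2 = rng.normal(0.0, 0.5, (16, 4))   # hidden -> output weights

v1, v2 = np.zeros(16), np.zeros(4)   # persistent membrane states across time steps
rates = []
for t in range(100):
    x = rng.normal(0.0, 1.0, 8)      # stand-in for an encoded converter measurement
    v1, s1 = lif_step(v1, x, w1)
    v2, s2 = lif_step(v2, s1, w2)
    rates.append(s2.mean())          # fraction of output neurons firing this step

mean_rate = float(np.mean(rates))
print(f"mean output spike rate: {mean_rate:.2f}")
```

The persistent membrane states `v1` and `v2` carry information across time steps, which is what enables the degradation tracking described above; a monitoring wrapper could flag an abrupt fault whenever the running spike rate jumps by several percentage points, in the spirit of the paper's event-driven detection.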

Why it matters

Always-on power converter health monitoring demands sub-mW edge inference, a regime out of reach for GPU-based physics-informed neural networks. This paper presents an SNN that reaches estimation accuracy within component manufacturing tolerance while projecting roughly 270x lower energy on neuromorphic hardware such as Intel Loihi 2 or BrainChip Akida, enabling reliable, continuous monitoring of critical power systems.

Original Abstract

Always-on converter health monitoring demands sub-mW edge inference, a regime inaccessible to GPU-based physics-informed neural networks. This work separates spiking temporal processing from physics enforcement: a three-layer leaky integrate-and-fire SNN estimates passive component parameters while a differentiable ODE solver provides physics-consistent training by decoupling the ODE physics loss from the unrolled spiking loop. On an EMI-corrupted synchronous buck converter benchmark, the SNN reduces lumped resistance error from $25.8\%$ to $10.2\%$ versus a feedforward baseline, within the $\pm 10\%$ manufacturing tolerance of passive components, at a projected ${\sim}270\times$ energy reduction on neuromorphic hardware. Persistent membrane states further enable degradation tracking and event-driven fault detection via a $+5.5$ percentage-point spike-rate jump at abrupt faults. With $93\%$ spike sparsity, the architecture is suited for always-on deployment on Intel Loihi 2 or BrainChip Akida.
