Direct-to-Event Spiking Neural Network Transfer
Nhan Trong Luu, Duong Trung Luu, Pham Ngoc Nam, Truong Cong Thang
TLDR
This paper investigates converting direct-coded Spiking Neural Networks to more energy-efficient event-based representations while preserving performance.
Key contributions
- Systematically investigates the transfer problem from direct-coded to event-based SNNs.
- Analyzes key challenges in converting continuous-valued SNNs to event-based representations.
- Proposes methods to enable energy-efficient SNN transfer while preserving model performance.
Why it matters
Direct-coded SNNs are less energy-efficient than event-based ones, limiting their practical use. This research enables converting existing SNNs to event-based representations, promoting reusability and unlocking low-power neuromorphic deployment.
Original Abstract
Spiking Neural Networks (SNNs) have gained increasing attention due to their potential for low-power computation on neuromorphic hardware. A widely adopted training strategy for SNNs is direct coding, which enables backpropagation on neuron implementations using continuous-valued surrogate activations. However, recent studies have shown that direct-coded SNNs remain substantially less energy-efficient than their event-based counterparts, limiting their practical deployment in energy-sensitive scenarios. To promote the reuse of SNNs pretrained with direct coding, this motivates an important yet underexplored question: How can an SNN pretrained with direct coding be effectively converted into an event-based representation? In this research, we present the first systematic investigation into this transfer problem, analyze the key challenges that arise when transitioning from direct-coded to event-based computation, and propose a set of methods to enable energy-efficient transfer while preserving model performance.
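To make the distinction at the heart of the paper concrete, here is a minimal sketch (not taken from the paper) of the two input-encoding schemes it contrasts: direct coding, which feeds the same continuous-valued input at every timestep, versus event-based rate coding, which emits binary spikes whose frequency reflects the input intensity. The variable names and the Bernoulli rate-coding scheme are illustrative assumptions, not the authors' specific method.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 8                             # number of simulation timesteps (assumed)
x = np.array([0.2, 0.9, 0.5])     # normalized input intensities in [0, 1]

# Direct coding: the continuous input is replicated at every timestep,
# so the first layer processes real-valued activations rather than spikes.
direct = np.tile(x, (T, 1))       # shape (T, 3), continuous values

# Event-based (rate) coding: each timestep independently emits a binary
# spike with probability equal to the intensity, so downstream layers
# see only 0/1 events -- the sparser, lower-energy representation.
events = (rng.random((T, len(x))) < x).astype(np.uint8)  # shape (T, 3), binary

# The mean firing rate over time approximates the original intensity.
print(events.mean(axis=0))
```

The transfer problem the paper studies is that a network trained on the dense `direct` tensor does not automatically perform well when fed the sparse, stochastic `events` tensor.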