ArXiv TLDR

Bridging Theory and Practice in Crafting Robust Spiking Reservoirs

arXiv: 2604.06395

Ruggero Freddi, Nicolas Seseri, Diana Nigrisoli, Alessio Basti

cs.LG, q-bio.NC, stat.ML

TLDR

This paper introduces the 'robustness interval' to simplify tuning spiking reservoirs, showing how to find stable, high-performing configurations.

Key contributions

  • Introduces 'robustness interval' to measure hyperparameter range for stable spiking reservoir performance.
  • Identifies monotonic trends: the robustness interval narrows with presynaptic connection density $β$ (i.e., widens with sparsity) and widens with the firing threshold $θ$.
  • Reveals iso-performance manifolds: specific $(β, θ)$ pairs preserve the analytical mean-field critical point $w_{\text{crit}}$.
  • Validates $w_{\text{crit}}$ as a robust starting point for parameter search, consistently found in high-performance regions.
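The "robustness interval" is defined operationally: the range of a hyperparameter over which task performance stays above a threshold. A minimal sketch of how one might compute its width from a sweep is shown below; the function name `robustness_interval`, the Gaussian toy score curve, and the sweep over a synaptic weight scale `w` are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def robustness_interval(scores, w_values, threshold):
    """Width of the widest contiguous hyperparameter range whose
    score stays above a task-dependent threshold (a hypothetical
    reading of the paper's 'robustness interval')."""
    above = np.asarray(scores) >= threshold
    best, start = 0.0, None
    for i, ok in enumerate(above):
        if ok and start is None:
            start = i                      # interval opens here
        if (not ok or i == len(above) - 1) and start is not None:
            end = i if ok else i - 1       # interval closed
            best = max(best, w_values[end] - w_values[start])
            start = None
    return best

# Toy sweep over a synaptic weight scale w, with accuracy peaked
# near a (hypothetical) critical point at w = 1.0
w = np.linspace(0.0, 2.0, 21)
acc = np.exp(-((w - 1.0) ** 2) / 0.1)
print(robustness_interval(acc, w, threshold=0.5))
```

With this toy curve, raising the threshold shrinks the interval, which is what makes the width a usable stability measure: flatter performance plateaus yield wider intervals at the same threshold.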

Why it matters

Tuning spiking reservoirs for optimal, energy-efficient performance is challenging. This paper introduces the 'robustness interval' and identifies hyperparameter trends, simplifying stable configuration search. It validates a theoretical critical point as a robust starting coordinate, making temporal processing more accessible.

Original Abstract

Spiking reservoir computing provides an energy-efficient approach to temporal processing, but reliably tuning reservoirs to operate at the edge-of-chaos is challenging due to experimental uncertainty. This work bridges abstract notions of criticality and practical stability by introducing and exploiting the robustness interval, an operational measure of the hyperparameter range over which a reservoir maintains performance above task-dependent thresholds. Through systematic evaluations of Leaky Integrate-and-Fire (LIF) architectures on both static (MNIST) and temporal (synthetic Ball Trajectories) tasks, we identify consistent monotonic trends in the robustness interval across a broad spectrum of network configurations: the robustness-interval width decreases with presynaptic connection density $β$ (i.e., directly with sparsity) and directly with the firing threshold $θ$. We further identify specific $(β, θ)$ pairs that preserve the analytical mean-field critical point $w_{\text{crit}}$, revealing iso-performance manifolds in the hyperparameter space. Control experiments on Erdős-Rényi graphs show the phenomena persist beyond small-world topologies. Finally, our results show that $w_{\text{crit}}$ consistently falls within empirical high-performance regions, validating $w_{\text{crit}}$ as a robust starting coordinate for parameter search and fine-tuning. To ensure reproducibility, the full Python code is publicly available.
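The reservoirs in the abstract are built from Leaky Integrate-and-Fire neurons with sparse recurrent connectivity. The following is a minimal discrete-time sketch, not the paper's model: the leak factor, input process, weight normalization, and all parameter values are assumptions; only the roles of the connection density $β$, the firing threshold $θ$, and a tunable weight scale (the quantity one would set near $w_{\text{crit}}$) come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100              # reservoir neurons
beta_density = 0.1   # presynaptic connection density (sparsity = 1 - density)
theta = 1.0          # firing threshold
leak = 0.9           # membrane leak factor per step (assumed)
w_scale = 0.5        # synaptic weight scale, the knob tuned toward w_crit

# Sparse random recurrent weights (Erdős-Rényi-style mask, as in the
# paper's control experiments; normalization is an assumption)
mask = rng.random((n, n)) < beta_density
W = w_scale * rng.standard_normal((n, n)) * mask / np.sqrt(beta_density * n)

v = np.zeros(n)          # membrane potentials
spikes = np.zeros(n)     # last step's spike vector
for t in range(50):
    inp = (rng.random(n) < 0.05).astype(float)  # toy Bernoulli input spikes
    v = leak * v + W @ spikes + inp             # leak + recurrence + input
    spikes = (v >= theta).astype(float)         # threshold crossing -> spike
    v[spikes == 1] = 0.0                        # reset fired neurons

print(spikes.sum())  # active neurons at the final step
```

In a reservoir-computing setup, only a readout trained on the spike (or potential) trajectories is fit to the task; the recurrent weights `W` stay fixed, which is why finding a stable `w_scale` regime matters so much.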
