ArXiv TLDR

Solution of a large nonlinear recurrent neural network at fixed connectivity

arXiv:2604.24141

Albert J. Wakhloo

cond-mat.dis-nn, q-bio.NC

TLDR

This paper calculates the moments and response functions of a large nonlinear recurrent neural network without averaging over the synaptic weights, i.e., at a fixed realization of the connectivity.

Key contributions

  • Calculates the moments and response functions of a nonlinear random recurrent neural network in the large $N$ limit.
  • Develops an approach that does not require averaging over synaptic weights.
  • Provides the first nontrivial term in a $1/\sqrt{N}$ expansion of correlation functions, proving a recent conjecture by Shen and Hu as a special case (see the schematic sketch after this list).
  • Establishes an analytical link between connectivity, activity correlations, and network response.
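
To make the expansion concrete, here is a schematic sketch. The rate dynamics and notation below are the standard setup in this literature, assumed here for illustration rather than quoted from the paper:

$$
\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{N} J_{ij}\,\phi\big(x_j(t)\big),
\qquad
C_{ij}(t,t') = \big\langle x_i(t)\,x_j(t') \big\rangle
\approx C^{(0)}_{ij}(t,t') + \frac{1}{\sqrt{N}}\,C^{(1)}_{ij}(t,t'),
$$

where $J_{ij}$ is a single fixed weight matrix, $\phi$ is the single-unit nonlinearity, $C^{(0)}$ is the leading large-$N$ (mean-field) term, and $C^{(1)}$ is the first nontrivial correction, which retains an explicit dependence on the particular realization of $J$.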

Why it matters

This paper offers an analytical method for understanding large nonlinear recurrent neural networks at a fixed realization of the synaptic weights rather than on average over an ensemble. This makes it possible to link a network's specific connectivity to the correlations in its spontaneous activity and to its response to small perturbations, advancing our theoretical understanding of brain-inspired models.

Original Abstract

We calculate the moments and response functions of a nonlinear random recurrent neural network in the large $N$ limit. Our approach does not require averaging over synaptic weights and gives the first nontrivial term in a $1/\sqrt{N}$ expansion of general intensive-order correlation functions, proving a recent conjecture by Shen and Hu as a special case. Our results provide an analytical link between synaptic connectivity, correlations in spontaneous activity, and the response of a network to small perturbations.
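
For a concrete handle on the quantities involved, below is a minimal numerical sketch. It assumes a standard rate model with a tanh nonlinearity and Gaussian couplings of variance $g^2/N$; these choices, and all names in the code, are illustrative assumptions, not the paper's exact model. The sketch estimates moments of spontaneous activity and a crude finite-difference response function at a single fixed draw of the weights, i.e., without averaging over connectivity:

```python
# Empirical moments and a finite-difference response function of a
# nonlinear recurrent network at ONE fixed draw of the synaptic weights.
# Model assumptions (illustrative, not from the paper): rate dynamics
# dx/dt = -x + J tanh(x) + h with J_ij ~ N(0, g^2 / N).
import numpy as np

rng = np.random.default_rng(0)

N = 400         # network size (the paper works in the large-N limit)
g = 1.5         # coupling gain; g > 1 gives rich spontaneous activity
dt = 0.05       # Euler integration step
T_burn = 2000   # steps discarded as transient
T_meas = 10000  # steps used for time averages

# One fixed realization of the connectivity.
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

def simulate(J, h=None):
    """Euler-integrate dx/dt = -x + J tanh(x) + h and return the
    post-transient trajectory, at the fixed weight matrix J."""
    h = np.zeros(N) if h is None else h
    x = rng.normal(0.0, 0.5, size=N)
    traj = np.empty((T_burn + T_meas, N))
    for t in range(T_burn + T_meas):
        x = x + dt * (-x + J @ np.tanh(x) + h)
        traj[t] = x
    return traj[T_burn:]

# Spontaneous activity: first and second moments at fixed J.
X = simulate(J)
mean_x = X.mean(axis=0)          # per-unit mean activity <x_i>
C = np.cov(X, rowvar=False)      # equal-time covariance C_ij

# Response function (crude proxy): apply a small static input to unit 0
# and measure the steady-state shift of every unit, ~ d<x_i>/dh_0.
eps = 1e-2
h = np.zeros(N)
h[0] = eps
chi = (simulate(J, h).mean(axis=0) - mean_x) / eps

# Mean absolute off-diagonal covariance, averaged over the N(N-1)
# pairs with i != j.
offdiag_mean = np.abs(C - np.diag(np.diag(C))).sum() / (N * (N - 1))
print("mean |<x_i>|        :", np.abs(mean_x).mean())
print("mean |C_ij|, i != j :", offdiag_mean)
print("self-response chi_00:", chi[0])
```

At a fixed realization of the weights, the per-unit means and off-diagonal correlations estimated above are small, realization-dependent quantities; this is the regime the paper's $1/\sqrt{N}$ expansion characterizes analytically, where a simulation like this one can only estimate them noisily.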
