Kuramoto Oscillatory Phase Encoding: Neuro-inspired Synchronization for Improved Learning Efficiency
Mingqing Xiao, Yansen Wang, Dongqi Han, Caihua Shan, Dongsheng Li
TLDR
KoPE integrates neuro-inspired oscillatory phase encoding into Vision Transformers, boosting learning efficiency via synchronization-enhanced structure learning.
Key contributions
- Introduces Kuramoto oscillatory Phase Encoding (KoPE), which adds an evolving phase state to Vision Transformers as a neuro-inspired synchronization mechanism.
- Boosts training, parameter, and data efficiency through synchronization-enhanced structure learning.
- Improves performance on tasks requiring structured understanding, including semantic and panoptic segmentation, representation alignment with language, and few-shot abstract visual reasoning (ARC-AGI).
Why it matters
Most deep networks represent and propagate information through activation values alone, ignoring the phase dynamics that biological circuits are thought to use for coordination. By incorporating oscillatory synchronization, this paper offers a scalable, neuro-inspired method that significantly improves the efficiency and structured understanding capabilities of state-of-the-art vision models.
Original Abstract
Spatiotemporal neural dynamics and oscillatory synchronization are widely implicated in biological information processing and have been hypothesized to support flexible coordination such as feature binding. By contrast, most deep learning architectures represent and propagate information through activation values, neglecting the joint dynamics of rate and phase. In this work, we introduce Kuramoto oscillatory Phase Encoding (KoPE) as an additional, evolving phase state to Vision Transformers, incorporating a neuro-inspired synchronization mechanism to advance learning efficiency. We show that KoPE can improve training, parameter, and data efficiency of vision models through synchronization-enhanced structure learning. Moreover, KoPE benefits tasks requiring structured understanding, including semantic and panoptic segmentation, representation alignment with language, and few-shot abstract visual reasoning (ARC-AGI). Theoretical analysis and empirical verification further suggest that KoPE can accelerate attention concentration for learning efficiency. These results indicate that synchronization can serve as a scalable, neuro-inspired mechanism for advancing state-of-the-art neural network models.
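The paper's concrete phase-update rule is not reproduced in this digest. For intuition, below is a minimal NumPy sketch of the classic Kuramoto dynamics that KoPE is named after, dθ_i/dt = ω_i + (K/N) · Σ_j sin(θ_j − θ_i). The Euler discretization, function names, and coupling value here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def kuramoto_step(phases, natural_freqs, coupling_strength, dt=0.01):
    """One Euler step of the classic Kuramoto model.

    Each oscillator's phase theta_i evolves as
        dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i).
    """
    n = len(phases)
    # Pairwise phase differences: diffs[i, j] = theta_j - theta_i.
    diffs = phases[None, :] - phases[:, None]
    coupling = (coupling_strength / n) * np.sin(diffs).sum(axis=1)
    return phases + dt * (natural_freqs + coupling)

# Toy example: 8 oscillators drift toward synchrony under strong coupling.
rng = np.random.default_rng(0)
phases = rng.uniform(0, 2 * np.pi, size=8)
freqs = rng.normal(1.0, 0.1, size=8)
for _ in range(1000):
    phases = kuramoto_step(phases, freqs, coupling_strength=2.0)

# Order parameter r in [0, 1]; r near 1 indicates synchronization.
r = np.abs(np.exp(1j * phases).mean())
print(f"order parameter r = {r:.3f}")
```

In KoPE, an update of this kind is presumably applied per token, so that tokens carrying related features pull their phases together; the order parameter r computed above is the standard measure of how synchronized such a population has become.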