Batch Normalization for Neural Networks on Complex Domains
Xuan Son Nguyen, Nistor Grozavu
TLDR
This paper proposes batch normalization layers for neural networks on complex domains, improving training stability and accuracy across a range of tasks.
Key contributions
- Proposes novel batch normalization (BN) layers designed specifically for neural networks operating on complex domains (see the sketch after this list).
- Develops practical implementation components for less-studied complex domains, such as the Siegel disk.
- Connects the new BN layers to existing Riemannian batch normalization techniques.
- Validates efficacy on radar clutter classification, node classification, and action recognition tasks.
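To make the first contribution concrete, below is a minimal sketch of whitening-based batch normalization for plain complex-valued activations, in the spirit of deep complex networks. It is an illustration under stated assumptions, not the paper's method: the proposed layers target structured complex domains such as the Siegel disk, and the function name `complex_batch_norm` and all implementation details here are hypothetical.

```python
import torch

def complex_batch_norm(z, eps=1e-5):
    """Whitening-based BN for complex activations (illustrative only;
    NOT the layer proposed in the paper, which handles structured
    complex domains such as the Siegel disk).

    z: complex tensor of shape (batch, features)
    """
    # Center: subtract the complex batch mean per feature.
    mu = z.mean(dim=0, keepdim=True)
    zc = z - mu

    x, y = zc.real, zc.imag
    # Per-feature 2x2 covariance of the (real, imag) parts.
    vxx = (x * x).mean(dim=0) + eps
    vyy = (y * y).mean(dim=0) + eps
    vxy = (x * y).mean(dim=0)

    # Closed-form inverse square root of each 2x2 covariance matrix.
    det = vxx * vyy - vxy * vxy
    s = det.sqrt()                    # sqrt of the determinant
    t = (vxx + vyy + 2.0 * s).sqrt()  # sqrt of trace + 2*sqrt(det)
    inv_st = 1.0 / (s * t)
    wxx = (vyy + s) * inv_st
    wyy = (vxx + s) * inv_st
    wxy = -vxy * inv_st

    # Apply the whitening transform to the (real, imag) pair.
    xn = wxx * x + wxy * y
    yn = wxy * x + wyy * y
    return torch.complex(xn, yn)

# Example: normalize a batch of complex-valued features.
z = torch.randn(128, 16, dtype=torch.cfloat)
zn = complex_batch_norm(z)
```

A full layer would typically follow the whitening step with a learnable 2x2 affine transform and a complex shift, mirroring the scale-and-shift parameters of ordinary batch normalization.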
Why it matters
This paper extends the benefits of batch normalization to neural networks operating on complex domains, an area that has received comparatively little attention. By providing practical implementations for these domains, it improves training stability and accuracy in a variety of real-world applications, broadening the reach of robust deep learning techniques.
Original Abstract
Riemannian neural networks have proven effective in solving a variety of machine learning tasks. The key to their success lies in the development of principled Riemannian analogs of fundamental building blocks in deep neural networks (DNNs). Among those, Riemannian batch normalization (BN) layers have been shown to enhance training stability and improve accuracy. In this paper, we propose BN layers for neural networks on complex domains. The proposed layers have close connections with existing Riemannian BN layers. We derive essential components for practical implementations of BN layers on some complex domains which are less studied in previous works, e.g., the Siegel disk domain. We conduct experiments on radar clutter classification, node classification, and action recognition demonstrating the efficacy of our method.
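For context on the existing Riemannian BN layers the abstract refers to, the sketch below illustrates the centering step commonly used in prior Riemannian BN work for symmetric positive definite (SPD) matrices: estimate the batch's Fréchet mean with a few Karcher-flow iterations, then conjugate each sample by the mean's inverse square root so the batch is centered at the identity. The helper names (`frechet_mean_spd`, `riemannian_bn_center`) and all details are assumptions for illustration, not the layers proposed in the paper.

```python
import torch

def _eig_apply(X, fn, clamp=True):
    # Apply a scalar function to the eigenvalues of a symmetric matrix.
    vals, vecs = torch.linalg.eigh(X)
    if clamp:
        vals = vals.clamp_min(1e-10)  # guard SPD eigenvalues
    return vecs @ torch.diag(fn(vals)) @ vecs.T

def frechet_mean_spd(Xs, iters=10):
    # Karcher flow under the affine-invariant metric: average the
    # matrix logarithms in the tangent space at the current estimate.
    G = Xs.mean(dim=0)  # initialize at the arithmetic mean
    for _ in range(iters):
        g_half = _eig_apply(G, torch.sqrt)
        g_ihalf = _eig_apply(G, torch.rsqrt)
        logs = torch.stack(
            [_eig_apply(g_ihalf @ X @ g_ihalf, torch.log) for X in Xs]
        )
        # Map the tangent-space mean back (no clamp: logs may be negative).
        G = g_half @ _eig_apply(logs.mean(dim=0), torch.exp, clamp=False) @ g_half
    return G

def riemannian_bn_center(Xs):
    # Center the batch at the identity: X -> G^{-1/2} X G^{-1/2},
    # where G is the Riemannian barycenter of the batch.
    G = frechet_mean_spd(Xs)
    g_ihalf = _eig_apply(G, torch.rsqrt)
    return torch.stack([g_ihalf @ X @ g_ihalf for X in Xs])

# Example: center a batch of random SPD matrices.
A = torch.randn(32, 5, 5)
Xs = A @ A.transpose(1, 2) + 0.1 * torch.eye(5)
print(riemannian_bn_center(Xs).shape)  # torch.Size([32, 5, 5])
```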