
Brain-inspired chaotic spiking backpropagation

In contrast to the common surrogate gradient scheme in (a), the chaotic spiking backpropagation (CSBP) algorithm introduces an additional brain-inspired loss function, loss_chaos, which comes from the output of each neuron and generates global chaotic dynamics. Credit: Science China Press

Since it was discovered in the 1980s that learning in the rabbit brain utilizes chaos, this nonlinear, initial-value-sensitive dynamical behavior has been increasingly recognized as integral to brain learning.

However, modern learning algorithms for artificial neural networks, particularly spiking neural networks (SNNs), which closely resemble the brain, have not effectively capitalized on this feature.

Zijian Wang and Peng Tao, together with lab director Luonan Chen, endeavored to introduce the brain's intrinsic chaotic dynamics into the learning algorithms of existing SNNs. They found that this could be accomplished by merely integrating a loss function analogous to cross-entropy (see image above).
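The idea of adding a single extra loss term can be sketched as follows. This is a minimal illustration, not the paper's actual formula: `chaos_loss` below is a hypothetical stand-in for loss_chaos, a cross-entropy-like penalty computed from each neuron's output, and the weight `lam` is an assumed hyperparameter.

```python
import numpy as np

def cross_entropy(probs, target):
    # Standard task loss on softmax probabilities.
    return -np.log(probs[target] + 1e-12)

def chaos_loss(outputs):
    # Hypothetical chaos-inducing term, analogous in form to
    # cross-entropy and computed from each neuron's output.
    # The paper's exact formula is not reproduced here.
    outputs = np.clip(outputs, 1e-6, 1 - 1e-6)
    return -np.sum(np.log(outputs) + np.log(1.0 - outputs))

def total_loss(probs, target, lam=0.1):
    # CSBP-style objective: the usual task loss plus a weighted
    # chaotic term, used as a plug-in to existing SNN training.
    return cross_entropy(probs, target) + lam * chaos_loss(probs)

probs = np.array([0.7, 0.2, 0.1])
print(total_loss(probs, target=0))
```

Because the extra term only adds to the scalar loss, it composes with any gradient-based SNN training scheme, which is what makes it usable as a plug-in unit.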

Furthermore, they observed that an SNN equipped with chaotic dynamics not only enhances learning/optimization performance but also improves the generalization performance on both neuromorphic datasets (e.g., DVS-CIFAR10 and DVS-Gesture) and large-scale static datasets (e.g., CIFAR100 and ImageNet), with the help of the ergodic and pseudo-random properties of chaos.

The team also experimented with introducing extrinsic chaos, for example via Logistic maps. However, this did not enhance the learning performance of SNNs. "This is an exciting result, and it implies that the intrinsic chaos that the brain has evolved over billions of years plays important roles in its learning efficiency," Luonan Chen says.
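For reference, the Logistic map is the classic source of extrinsic chaos of the kind the team tested: a simple iteration that is deterministic yet extremely sensitive to initial conditions. The sketch below (with the commonly used chaotic parameter r = 3.9) shows two trajectories from nearly identical starting points diverging:

```python
def logistic_map(x0, r=3.9, steps=100):
    # Iterate x_{n+1} = r * x_n * (1 - x_n), the Logistic map.
    # For r near 4 the dynamics are chaotic on [0, 1].
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Sensitivity to initial conditions: a 1e-9 perturbation of the
# starting point eventually produces a macroscopic divergence.
a = logistic_map(0.2)
b = logistic_map(0.2 + 1e-9)
print(max(abs(x - y) for x, y in zip(a, b)))
```

Unlike the brain's intrinsic chaos exploited by CSBP, this kind of externally injected pseudo-randomness did not improve SNN learning in the authors' experiments.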

Although SNNs offer stronger spatio-temporal signal characterization and higher energy efficiency, their performance often lags behind that of traditional neural networks of equivalent size due to the lack of efficient learning algorithms. This new algorithm effectively bridges that gap. Moreover, since only one additional loss function needs to be introduced, it can be employed as a generalized plug-in unit with existing SNN learning methodologies.

The study is published in the journal National Science Review.

More information: Zijian Wang et al, Brain-inspired chaotic spiking backpropagation, National Science Review (2024). DOI: 10.1093/nsr/nwae037

Journal information: National Science Review
Citation: Brain-inspired chaotic spiking backpropagation (2024, March 29) retrieved 1 May 2024 from https://techxplore.com/news/2024-03-brain-chaotic-spiking-backpropagation.html
