pak boosted

Developmental Plasticity-inspired Adaptive Pruning for Deep Spiking and Artificial Neural Networks
arxiv.org/abs/2211.12714

Developmental plasticity plays a vital role in shaping the brain's structure during ongoing learning in response to dynamically changing environments. However, existing network compression methods for deep artificial neural networks (ANNs) and spiking neural networks (SNNs) draw little inspiration from the brain's developmental plasticity mechanisms, limiting their ability to learn efficiently, rapidly, and accurately. This paper proposes a developmental plasticity-inspired adaptive pruning (DPAP) method, inspired by the adaptive developmental pruning of dendritic spines, synapses, and neurons according to the "use it or lose it, gradually decay" principle. The proposed DPAP model incorporates multiple biologically realistic mechanisms (such as dendritic spine dynamic plasticity, activity-dependent neural spiking traces, and local synaptic plasticity), together with an adaptive pruning strategy, so that the network structure can be dynamically optimized during learning without any pre-training or retraining. We demonstrate that DPAP, applied to deep ANNs and SNNs, learns efficient network architectures that retain only the relevant, important connections and neurons. Extensive comparative experiments show consistent, remarkable gains in performance and speed from the extremely compressed networks on a diverse set of benchmark tasks, especially neuromorphic datasets for SNNs. This work explores how developmental plasticity enables complex deep networks to gradually evolve into brain-like efficient and compact structures, eventually achieving state-of-the-art (SOTA) performance for biologically realistic SNNs.
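The "use it or lose it, gradually decay" principle can be sketched as keeping an exponentially decaying activity trace per connection and masking out connections whose trace stays low during learning. A minimal NumPy sketch of that idea follows; the constants (`DECAY`, `THRESHOLD`, `WARMUP`) and the toy activity model are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

DECAY = 0.9        # exponential decay of the activity trace per step
THRESHOLD = 0.05   # connections whose trace falls below this are pruned
WARMUP = 10        # let traces accumulate before pruning starts

weights = rng.normal(size=(8, 8))
mask = np.ones_like(weights, dtype=bool)  # True = connection still alive
trace = np.zeros_like(weights)            # activity-dependent trace

for step in range(50):
    # Toy activity: synapses in the first 4 columns fire often (p=0.8),
    # the rest almost never (p=0.01).
    p_fire = np.where(np.arange(8) < 4, 0.8, 0.01)
    active = rng.random(weights.shape) < p_fire
    trace = DECAY * trace + active        # recent "use", gradually decaying
    if step >= WARMUP:
        mask &= trace >= THRESHOLD        # "lose it" if unused

weights *= mask  # rarely-used connections end up removed
```

After the loop, nearly all of the rarely-active synapses are masked out while the frequently-used ones survive, without any pre-training or retraining pass; the actual DPAP method additionally scores dendritic spines and whole neurons, which this sketch omits.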

pak boosted

What math do neuroscientists need to know?

A highlight of #SFN2022 was Ella Batty's answer to this question. She showed off an incredible math for neuroscientists course she has developed at Harvard with open materials (ebatty.github.io/MathToolsforN) and discussed her amazing work with Neuromatch Academy (compneuro.neuromatch.io/)

Her SFN slides are here: osf.io/s94b2

#Neuroscience #GradSchool #Math

pak boosted

Hello all!

I'm a postdoc at CUHK working on the application of , in Psychology.

Lovely to see you all on Mastodon! Great to meet you in the decentralized social network!

I have three cats and two dogs :)
