As a proof-of-concept, we show that our network can learn to classify handwritten digits. We used three classes from the MNIST dataset: classes 0 and 1 were used for training and testing, and class 2 was included in the test phase but unseen during training.

Our networks separated the two trained digit classes best when we applied a stricter 90% output-matching threshold. With an 80% threshold, many networks were unable to separate the two classes.
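The thread doesn't spell out how the output-matching threshold is applied, so here is a minimal sketch under our own assumptions: the network's binary output pattern is compared against a per-class template, and a class is assigned only if the fraction of matching bits reaches the threshold. The function and template names are ours, not from the paper.

```python
def classify(output_pattern, class_templates, threshold=0.9):
    """Threshold-based readout sketch (assumed decoding; details are ours).

    Compares a binary output pattern to each class template and returns
    the best-matching label only if the match fraction meets `threshold`;
    otherwise returns None (no decision).
    """
    best_label, best_score = None, 0.0
    for label, template in class_templates.items():
        # Fraction of positions where output and template agree
        score = sum(o == t for o, t in zip(output_pattern, template)) / len(template)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None
```

Under this reading, raising the threshold from 80% to 90% simply rejects more near-matches, which would explain why the stricter setting separates the trained classes more cleanly.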

With the lower 80% threshold, many networks could also identify a third, untrained digit class with accuracies upwards of 60%. Could we use this approach for generalized learning?


Can we train spiking neural networks with local rules that change the spike transmission speed? 🧵 New preprint: arxiv.org/abs/2211.08397 From the master's thesis work of Jørgen Farner! Proud supervisor here 😊 @stenichele

We developed a new STDP-like delay learning rule for SNNs: Pre-synaptic spikes arriving at a post-synaptic neuron within a certain time window before a post-synaptic spike have their transmission delays adjusted to push future spikes closer to the average arrival time.
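The exact update equation isn't given in the thread, but the rule described above can be sketched as follows. This is a minimal illustration under our assumptions: delays of pre-synaptic spikes that arrive within a causal window before the post-synaptic spike are nudged so that future arrivals drift toward the group's mean arrival time. The window size, learning rate, and function name are ours.

```python
import numpy as np

def update_delays(delays, arrival_times, post_spike_time, window=10.0, lr=0.1):
    """STDP-like delay plasticity sketch (assumed form, not the paper's exact rule).

    Pre-synaptic spikes arriving within `window` ms before the
    post-synaptic spike have their transmission delays adjusted so that
    future arrivals move toward the mean arrival time of that causal group.
    """
    delays = np.array(delays, dtype=float)          # copy; don't mutate caller
    arrival_times = np.asarray(arrival_times, dtype=float)
    # Spikes in the causal window just before the post-synaptic spike
    causal = (arrival_times > post_spike_time - window) & (arrival_times <= post_spike_time)
    if not causal.any():
        return delays
    mean_arrival = arrival_times[causal].mean()
    # Late arrivals get shorter delays, early arrivals longer ones,
    # pulling the group toward synchrony at the mean arrival time
    delays[causal] += lr * (mean_arrival - arrival_times[causal])
    return delays
```

Note the locality: each synapse only needs its own spike arrival time and the post-synaptic spike time, which is what makes this an STDP-style rule.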

Is this local delay plasticity rule alone sufficient to train a network to perform a simple computational task? To answer this, we found suitable encoding and decoding methods for our approach and applied the resulting framework to a classification task.
