Abstract: Synaptic delays play a crucial role in biological neuronal networks, where their modulation has been observed in mammalian learning processes. In neuromorphic computing, although spiking neural networks (SNNs) aim to emulate biology more closely than traditional artificial neural networks do, synaptic delays are rarely incorporated into their simulation. We introduce a novel learning rule for simultaneously learning synaptic connection strengths and delays by extending spike-timing dependent plasticity (STDP), a Hebbian method commonly used for learning synaptic weights. We validate our approach by extending a widely used SNN model for classification trained with unsupervised learning. We then demonstrate the effectiveness of the new method by comparing it against an existing method for co-learning synaptic weights and delays, as well as against STDP without synaptic delays. Results show that our proposed method consistently achieves superior performance across a variety of test scenarios. Furthermore, our experimental results yield insight into the interplay between synaptic efficacy and delay.
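The abstract does not spell out the update equations, but a pair-based co-update of weight and delay can be sketched as follows. This is a minimal illustration, not the paper's rule: the function name, kernel shapes, constants (`A_plus`, `A_minus`, `B_plus`, `B_minus`, `tau`, `sigma`), and sign conventions are all assumptions, chosen so that causal spike pairs potentiate the weight (classic STDP) while the delay shifts the presynaptic arrival time toward the postsynaptic spike.

```python
import math

def stdp_weight_delay_update(t_pre, t_post, w, d,
                             A_plus=0.01, A_minus=0.012,
                             B_plus=0.1, B_minus=0.1,
                             tau=20.0, sigma=20.0):
    """Pair-based co-update of weight w and delay d (times in ms).

    dt measures the postsynaptic spike time against the *delayed*
    presynaptic arrival (t_pre + d). Causal pairs (dt >= 0) potentiate
    the weight; anti-causal pairs depress it. The delay moves so the
    arrival aligns with the postsynaptic spike (one plausible objective;
    all constants and signs here are illustrative assumptions).
    """
    dt = t_post - (t_pre + d)
    if dt >= 0:   # spike arrived before the post spike: causal
        w += A_plus * math.exp(-dt / tau)
        d += B_plus * math.exp(-dt / sigma)   # arrival early: lengthen delay
    else:         # spike arrived after the post spike: anti-causal
        w -= A_minus * math.exp(dt / tau)
        d -= B_minus * math.exp(dt / sigma)   # arrival late: shorten delay
    return w, max(d, 0.0)   # delays must remain non-negative
```

In a full simulation this update would typically be driven by per-synapse eligibility traces at each spike event rather than by explicit enumeration of spike pairs.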
Abstract: This paper introduces Inferno, a software library built on top of PyTorch that is designed to meet the distinctive challenges of using spiking neural networks (SNNs) for machine learning tasks. We describe the architecture of Inferno and the key differentiators that make it uniquely well-suited to these tasks. We show how Inferno supports trainable heterogeneous delays on both CPUs and GPUs, and how it enables a "write once, apply everywhere" development methodology for novel models and techniques. We compare Inferno's performance to BindsNET, a library aimed at machine learning with SNNs, and to Brian2/Brian2CUDA, which are popular in neuroscience. Among several examples, we show how the design decisions made in Inferno facilitate easily implementing the new delay-learning methods of Nadafian and Ganjtabesh with spike-timing dependent plasticity.
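To make the "trainable heterogeneous delays" claim concrete, below is a device-agnostic PyTorch sketch of one common implementation strategy: a circular spike buffer indexed by per-synapse integer delays. The class `DelayedSynapse` and every detail of it are our own illustration under stated assumptions, not Inferno's actual API; the point is that the same tensor code runs unchanged on CPU or GPU depending on `device`.

```python
import torch

class DelayedSynapse(torch.nn.Module):
    """Synapses with one trainable delay per connection (hypothetical sketch)."""

    def __init__(self, n_pre: int, n_post: int, max_delay: int, device: str = "cpu"):
        super().__init__()
        self.max_delay = max_delay
        self.weight = torch.nn.Parameter(torch.rand(n_post, n_pre, device=device))
        # Per-synapse delays in integer time steps; assumed to be updated by a
        # plasticity rule (e.g. delay STDP) rather than autograd, hence a buffer.
        self.register_buffer(
            "delay", torch.randint(0, max_delay, (n_post, n_pre), device=device)
        )
        # Circular buffer holding the last `max_delay` presynaptic spike vectors.
        self.register_buffer("spike_buf", torch.zeros(max_delay, n_pre, device=device))
        self.t = 0

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (n_pre,) binary vector for the current time step.
        self.spike_buf[self.t % self.max_delay] = spikes
        # Each synapse (i, j) reads neuron j's spike from delay[i, j] steps ago.
        idx = (self.t - self.delay) % self.max_delay             # (n_post, n_pre)
        cols = torch.arange(self.spike_buf.shape[1], device=spikes.device)
        delayed = self.spike_buf[idx, cols]                      # (n_post, n_pre)
        self.t += 1
        return (self.weight * delayed).sum(dim=1)                # (n_post,) current

# Usage: the identical code targets CPU or GPU by changing `device`.
dev = "cuda" if torch.cuda.is_available() else "cpu"
syn = DelayedSynapse(n_pre=100, n_post=10, max_delay=20, device=dev)
current = syn(torch.bernoulli(torch.full((100,), 0.1, device=dev)))
```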