Membrane Potential Distribution Adjustment and Parametric Surrogate Gradient in Spiking Neural Networks

Apr 26, 2023
Siqi Wang, Tee Hiang Cheng, Meng-Hiot Lim

As an emerging network model, spiking neural networks (SNNs) have attracted significant research attention in recent years. However, their energy-efficient binary spikes are non-differentiable and therefore ill-suited to gradient descent-based training. The surrogate gradient (SG) strategy has been investigated and applied to circumvent this issue and train SNNs from scratch. Due to the lack of a well-recognized SG selection rule, most SGs are chosen intuitively. We propose the parametric surrogate gradient (PSG) method, which iteratively updates the SG and eventually determines an optimal surrogate gradient parameter that calibrates the shape of the candidate SGs. In SNNs, the membrane potential distribution tends to deviate unpredictably due to quantization error. We evaluate this potential shift and propose a potential distribution adjustment (PDA) methodology to minimize the loss of undesired pre-activations. Experimental results demonstrate that the proposed methods can be readily integrated with the backpropagation through time (BPTT) algorithm and help the modulated SNNs achieve state-of-the-art performance on both static and dynamic datasets with fewer timesteps.
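The abstract describes PSG only at a high level. As a minimal sketch of the core idea, the PyTorch code below is illustrative: the framework choice, the class name ParametricSpike, and the sigmoid-shaped surrogate are assumptions, not the authors' exact formulation. It makes the surrogate's sharpness parameter alpha a trainable quantity, so that BPTT can iteratively update the SG shape as the abstract describes.

import torch

class ParametricSpike(torch.autograd.Function):
    """Heaviside spike with a parametric surrogate gradient (sketch only).

    The surrogate is the derivative of sigmoid(alpha * v), and alpha
    also receives a gradient, so the SG shape is learned jointly with
    the network weights.
    """

    @staticmethod
    def forward(ctx, v, alpha):
        ctx.save_for_backward(v, alpha)
        return (v >= 0.0).float()  # binary spike from membrane potential v

    @staticmethod
    def backward(ctx, grad_output):
        v, alpha = ctx.saved_tensors
        s = torch.sigmoid(alpha * v)
        # Surrogate for d(spike)/dv: d/dv sigmoid(alpha*v) = alpha*s*(1-s)
        grad_v = grad_output * alpha * s * (1.0 - s)
        # d/d(alpha) of alpha*s*(1-s) = s*(1-s)*(1 + alpha*v*(1-2s));
        # summing gives the gradient for the scalar shape parameter.
        grad_alpha = (grad_output * s * (1.0 - s)
                      * (1.0 + alpha * v * (1.0 - 2.0 * s))).sum()
        return grad_v, grad_alpha

# Usage: register alpha as a parameter so the optimizer updates the SG shape.
alpha = torch.nn.Parameter(torch.tensor(2.0))
spikes = ParametricSpike.apply(torch.randn(8), alpha)

Exposing grad_alpha is the design choice that matches the abstract's description: instead of fixing the SG shape by hand, its calibrating parameter is optimized during training.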

* 10 pages, 8 figures 