norse.torch

Building blocks for spiking neural networks based on PyTorch.

Containers

Lift

Lift applies a given torch.nn.Module over a temporal sequence, applying the module at every timestep.

SequentialState

A sequential model that works like PyTorch's Sequential with the addition that it handles neuron states.

RegularizationCell

A regularization cell that accumulates some state (for instance the number of spikes) for each forward step, which can later be applied to a loss term.
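The state handling that SequentialState provides can be illustrated with a minimal sketch. This is not Norse's implementation, just a plain-Python illustration of the pattern: each layer returns both an output and a state, and the container threads the per-layer states through every forward call.

```python
# Illustrative sketch (not Norse's implementation) of a sequential
# container that threads per-layer state through each forward call,
# the way SequentialState does for neuron states.
class StatefulSequential:
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x, states=None):
        # On the first call, every layer starts with an empty state.
        if states is None:
            states = [None] * len(self.layers)
        new_states = []
        for layer, state in zip(self.layers, states):
            x, state = layer(x, state)
            new_states.append(state)
        return x, new_states

# A toy stateful "layer" standing in for a neuron cell:
# it accumulates its input across calls.
def accumulator(x, state):
    state = (state or 0.0) + x
    return state, state

net = StatefulSequential(accumulator, accumulator)
out, states = net.forward(1.0)          # first timestep
out, states = net.forward(1.0, states)  # second timestep reuses the states
```

The key design point is that the container, not the user, owns the bookkeeping of one state object per layer, so stateless modules and stateful neuron cells can be mixed freely.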

Encoding

ConstantCurrentLIFEncoder

Encodes input currents as fixed (constant) voltage currents, and simulates the spikes that occur during a number of timesteps/iterations (seq_length).

PoissonEncoder

Encodes a tensor of input values, which are assumed to be in the range [0,1] into a tensor of one dimension higher of binary values, which represent input spikes.

PoissonEncoderStep

Encodes a tensor of input values, which are assumed to be in the range [0,1] into a tensor of binary values, which represent input spikes.
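The idea behind Poisson encoding can be sketched in a few lines of plain Python (this is not Norse's tensor-based implementation): each value in [0, 1] is treated as a per-timestep firing probability, and sampling over seq_length timesteps yields a binary spike train.

```python
import random

# Illustrative sketch of Poisson/Bernoulli spike encoding (not Norse's
# PoissonEncoder): each input value in [0, 1] is interpreted as a firing
# probability per timestep, producing a seq_length x n binary spike train.
def poisson_encode(values, seq_length, rng=None):
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    return [
        [1 if rng.random() < p else 0 for p in values]
        for _ in range(seq_length)
    ]

# A value of 0.0 never fires; a value of 1.0 fires at every timestep.
spikes = poisson_encode([0.0, 1.0], seq_length=10)
```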

PopulationEncoder

Encodes a set of input values into population codes, such that each individual input value is represented by a list of numbers (typically calculated by a radial basis kernel) whose length is equal to the out_features.
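A minimal sketch of population coding with a Gaussian radial basis kernel (the kernel and the parameter names here are illustrative, not Norse's API): each scalar is compared against out_features evenly spaced "preferred values", and the activation falls off with distance from each centre.

```python
import math

# Illustrative population coding with a Gaussian radial basis kernel
# (assumed kernel; the distance function is configurable in practice).
# Each scalar input is mapped to `out_features` activations, one per
# preferred value spread evenly over [low, high].
def population_encode(value, out_features, low=0.0, high=1.0, scale=0.1):
    centres = [
        low + i * (high - low) / (out_features - 1)
        for i in range(out_features)
    ]
    return [
        math.exp(-((value - c) ** 2) / (2 * scale ** 2))
        for c in centres
    ]

# The neuron whose preferred value matches the input responds maximally.
codes = population_encode(0.5, out_features=5)
```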

SignedPoissonEncoder

Encodes a tensor of input values, which are assumed to be in the range [-1,1] into a tensor of one dimension higher of values in {-1,0,1}, which represent signed input spikes.

SpikeLatencyEncoder

For all neurons, remove all but the first spike.

SpikeLatencyLIFEncoder

Encodes an input value by the time the first spike occurs.

Convolutions

LConv2d

Implements a 2d-convolution applied pointwise in time.

Neuron models

Integrate-and-fire

Simple integrators that sum up incoming signals until a threshold is reached.

IAFFeedForwardState

State of a feed forward integrate-and-fire neuron

IAFParameters

Parametrization of an integrate-and-fire neuron

IAFCell

Module that computes a single step of an integrate-and-fire neuron-model without recurrence and without time.

Izhikevich

IzhikevichState

State of an Izhikevich neuron

IzhikevichSpikingBehavior

Spiking behavior of an Izhikevich neuron

Izhikevich

A neuron layer that wraps an IzhikevichCell in time such that the layer keeps track of temporal sequences of spikes. After application, the layer returns a tuple containing (spikes from all timesteps, state from the last timestep).

IzhikevichCell

Module that computes a single Izhikevich neuron-model without recurrence and without time.

IzhikevichRecurrent

A neuron layer that wraps an IzhikevichRecurrentCell in time such that the layer keeps track of temporal sequences of spikes. After application, the layer returns a tuple containing (spikes from all timesteps, state from the last timestep).

IzhikevichRecurrentCell

Module that computes a single euler-integration step of an Izhikevich neuron-model with recurrence but without time.

Leaky integrator

Leaky integrators describe a leaky neuron membrane that integrates incoming currents over time, but never spikes. In other words, the neuron adds up incoming input current, while leaking out some of it in every timestep.

\[\begin{split}\begin{align*} \dot{v} &= 1/\tau_{\text{mem}} (v_{\text{leak}} - v + i) \\ \dot{i} &= -1/\tau_{\text{syn}} i \end{align*}\end{split}\]

The first equation describes how the membrane voltage (\(v\), across the membrane) changes over time: in every timestep the voltage leaks towards the leak potential (\(v_{\text{leak}}\)), while the incoming current (\(i\)) is added.

The second equation describes how the current flowing into the neuron changes in every timestep.

Notice that both equations are parameterized by a time constant \(\tau\). This constant controls how fast the changes in voltage and current occur: a large time constant means a small change per timestep. In Norse, we store the inverse of each time constant to avoid having to recalculate the inverse at every step (tau_mem_inv and tau_syn_inv respectively). So, for Norse a large inverse time constant means rapid changes, while a small inverse time constant means slow changes.

Recall that voltage is the difference in electric potential between two points (in this case across the neuron membrane), and that current is the rate at which charge flows, i.e. the amount of charge added or subtracted at each timestep.

More information can be found on Wikipedia.
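The equations above can be integrated with a simple forward-Euler step. The following is an illustrative plain-Python sketch (not Norse's implementation); the default parameter values are assumptions chosen only to make the dynamics visible, and the inverse time constants follow the tau_mem_inv/tau_syn_inv convention described above.

```python
# Illustrative Euler integration of the leaky-integrator equations
# (pure Python sketch, not Norse's implementation):
#   dv/dt = tau_mem_inv * (v_leak - v + i)
#   di/dt = -tau_syn_inv * i
def li_step(v, i, input_current, tau_mem_inv=100.0, tau_syn_inv=200.0,
            v_leak=0.0, dt=0.001):
    dv = dt * tau_mem_inv * (v_leak - v + i)
    di = -dt * tau_syn_inv * i
    # Incoming current is added to the (leaky) synaptic current.
    return v + dv, i + di + input_current

# Drive the integrator with a constant input: the current settles at a
# steady value and the voltage charges up towards it, but never spikes.
v, i = 0.0, 0.0
for _ in range(100):
    v, i = li_step(v, i, input_current=0.1)
```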

LIState

State of a leaky-integrator

LIParameters

Parameters of a leaky integrator

LI

A neuron layer that wraps a leaky-integrator LICell in time, but without recurrence.

LICell

Cell for a leaky-integrator without recurrence.

LILinearCell

Cell for a leaky-integrator with an additional linear weighting.

Leaky integrate-and-fire (LIF)

A popular neuron model that combines a norse.torch.functional.leaky_integrator with spike thresholds to produce events (spikes).

The model describes the change in a neuron membrane voltage (\(v\)) and inflow current (\(i\)). See the leaky_integrator module for more information.

\[\begin{split}\begin{align*} \dot{v} &= 1/\tau_{\text{mem}} (v_{\text{leak}} - v + i) \\ \dot{i} &= -1/\tau_{\text{syn}} i \end{align*}\end{split}\]

The F in LIF stands for the thresholded “firing” events that occur if the neuron voltage increases over a certain point or threshold (\(v_{\text{th}}\)).

\[z = \Theta(v - v_{\text{th}})\]

In regular artificial neural networks, this is referred to as the activation function. The behavior can be controlled by setting the method field in the neuron parameters, but will default to the SuperSpike synthetic-gradient approach that uses the Heaviside step function:

\[\begin{split}H[n]=\begin{cases} 0, & n \leq 0 \\ 1, & n > 0 \end{cases}\end{split}\]

More information can be found on Wikipedia or in the book *Neuronal Dynamics* by W. Gerstner et al., freely available online.
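Putting the leaky-integrator dynamics together with the threshold gives a complete LIF step. The sketch below is illustrative plain Python, not Norse's LIFCell; the parameter values are assumptions, and the hard Heaviside threshold stands in for the surrogate-gradient version used during training.

```python
# Illustrative single LIF Euler step (pure Python sketch, not Norse's
# LIFCell). The Heaviside threshold produces the spike z, and the
# membrane voltage is reset to v_reset after a spike.
def lif_step(v, i, input_current, tau_mem_inv=100.0, tau_syn_inv=200.0,
             v_leak=0.0, v_th=1.0, v_reset=0.0, dt=0.001):
    dv = dt * tau_mem_inv * (v_leak - v + i)
    di = -dt * tau_syn_inv * i
    v, i = v + dv, i + di + input_current
    z = 1.0 if v > v_th else 0.0   # Heaviside threshold crossing
    v = v_reset if z else v        # reset the membrane after a spike
    return z, v, i

# With a sufficiently strong constant input the neuron charges up,
# crosses the threshold, resets, and fires periodically.
spikes = 0
v, i = 0.0, 0.0
for _ in range(1000):
    z, v, i = lif_step(v, i, input_current=0.3)
    spikes += z
```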

LIFParameters

Parametrization of a LIF neuron

LIFState

State of a LIF neuron

LIF

A neuron layer that wraps a LIFCell in time such that the layer keeps track of temporal sequences of spikes. After application, the layer returns a tuple containing (spikes from all timesteps, state from the last timestep).

LIFCell

Module that computes a single euler-integration step of a leaky integrate-and-fire (LIF) neuron-model without recurrence and without time.

LIFRecurrent

A neuron layer that wraps a LIFRecurrentCell in time such that the layer keeps track of temporal sequences of spikes. After application, the module returns a tuple containing (spikes from all timesteps, state from the last timestep).

LIFRecurrentCell

Module that computes a single euler-integration step of a leaky integrate-and-fire (LIF) neuron-model with recurrence but without time.

LIF, box model

A simplified version of the popular leaky integrate-and-fire neuron model that combines a norse.torch.functional.leaky_integrator with spike thresholds to produce events (spikes). Compared to the norse.torch.functional.lif modules, this model leaves out the current term, making it computationally simpler but impossible to implement in physical systems, because currents cannot "jump" in nature. It is these sudden current jumps that give the model its name: the shift in current is instantaneous and can be drawn as "current boxes".
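The simplification can be seen by comparing a box-model step with the full LIF step: the synaptic current equation disappears, and the input "jumps" the membrane voltage directly. This is an illustrative plain-Python sketch (not Norse's LIFBoxCell), with assumed parameter values.

```python
# Illustrative "box" LIF step (pure Python sketch, not Norse's
# LIFBoxCell): the synaptic current term is dropped, so the input
# current jumps the membrane voltage directly.
def lif_box_step(v, input_current, tau_mem_inv=100.0, v_leak=0.0,
                 v_th=1.0, v_reset=0.0, dt=0.001):
    v = v + dt * tau_mem_inv * (v_leak - v) + input_current
    z = 1.0 if v > v_th else 0.0   # threshold crossing
    return z, (v_reset if z else v)

# A constant input charges the membrane until it crosses the threshold,
# after which the voltage resets and the cycle repeats.
v, spikes = 0.0, 0
for _ in range(100):
    z, v = lif_box_step(v, input_current=0.2)
    spikes += z
```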

LIFBoxFeedForwardState

State of a feed forward LIF neuron

LIFBoxParameters

Parametrization of a boxed LIF neuron

LIFBoxCell

Computes a single euler-integration step for a LIF neuron-model without current terms.

LIF, conductance based

CobaLIFCell

Module that computes a single euler-integration step of a conductance based LIF neuron-model.

LIF, adaptive exponential

LIFAdEx

A neuron layer that wraps a LIFAdExCell in time such that the layer keeps track of temporal sequences of spikes. After application, the layer returns a tuple containing (spikes from all timesteps, state from the last timestep).

LIFAdExCell

Computes a single euler-integration step of a feed-forward adaptive exponential LIF neuron-model without recurrence, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.

LIFAdExRecurrent

A neuron layer that wraps a LIFAdExRecurrentCell in time such that the layer keeps track of temporal sequences of spikes. After application, the layer returns a tuple containing (spikes from all timesteps, state from the last timestep).

LIFAdExRecurrentCell

Computes a single euler-integration step of an adaptive exponential LIF neuron-model with recurrence, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.

LIF, exponential

LIFEx

A neuron layer that wraps a LIFExCell in time such that the layer keeps track of temporal sequences of spikes. After application, the layer returns a tuple containing (spikes from all timesteps, state from the last timestep).

LIFExCell

Computes a single euler-integration step of an exponential LIF neuron-model without recurrence, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.

LIFExRecurrent

A neuron layer that wraps a LIFExRecurrentCell in time such that the layer keeps track of temporal sequences of spikes. After application, the module returns a tuple containing (spikes from all timesteps, state from the last timestep).

LIFExRecurrentCell

Computes a single euler-integration step of an exponential LIF neuron-model with recurrence, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.

LIF, multicompartmental

LIFMCRecurrentCell

Computes a single euler-integration step of a LIF multi-compartment neuron-model.

LIF, refractory

LIFRefracCell

Module that computes a single euler-integration step of a LIF neuron-model with absolute refractory period without recurrence.

LIFRefracRecurrentCell

Module that computes a single euler-integration step of a LIF neuron-model with absolute refractory period and with recurrence.

Long short-term memory (LSNN)

LSNN

A Long short-term memory neuron module without recurrence adapted from https://arxiv.org/abs/1803.09574

LSNNCell

Euler-integration cell for a LIF neuron with threshold adaptation, without recurrence.

LSNNRecurrent

A Long short-term memory neuron module with recurrence adapted from https://arxiv.org/abs/1803.09574

LSNNRecurrentCell

Module that computes a single euler-integration step of an LSNN neuron-model with recurrence.