norse.torch.functional#

Encoding#

Stateless spiking neural network components.

Logical#

logical_and

Computes a logical and, provided x and y are bitvectors.

logical_xor

Computes a logical xor, provided x and y are bitvectors.

logical_or

Computes a logical or, provided x and y are bitvectors.

muller_c

Computes the Muller C-element's next state, provided x_1 and x_2 are bitvectors and y_prev is the previous state.

posedge_detector

Determines whether a transition from 0 to 1 has occurred, provided that z and z_prev are bitvectors.
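These operations work element-wise on tensors containing only zeros and ones. As a rough sketch of the behaviour described above (illustrative re-implementations in plain PyTorch, not the library's source):

    import torch

    # "Bitvectors" here are tensors containing only 0.0 and 1.0.
    x = torch.tensor([0.0, 1.0, 1.0, 0.0])
    y = torch.tensor([0.0, 0.0, 1.0, 1.0])

    z_and = x * y                       # 1 only where both inputs are 1
    z_or = torch.clamp(x + y, max=1.0)  # 1 where at least one input is 1
    z_xor = torch.remainder(x + y, 2)   # 1 where exactly one input is 1

    # Muller C-element: the output takes the common value when both
    # inputs agree and otherwise keeps its previous state.
    y_prev = torch.zeros_like(x)
    agree = x * y + (1 - x) * (1 - y)   # 1 where x == y
    y_next = agree * x + (1 - agree) * y_prev

    # Positive-edge detector: 1 where the signal went from 0 to 1.
    z_prev = torch.tensor([0.0, 1.0, 0.0, 0.0])
    z = torch.tensor([1.0, 1.0, 1.0, 0.0])
    posedge = z * (1 - z_prev)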

Regularization#

This module contains functional components for regularization operations on spiking layers, where it can be desirable to regularize spikes, membrane parameters, or other properties over time.

In this functional module, the aim is to collect some state s at each forward step. The collection depends on the output of the layer and, by default, simply counts spikes. It is the job of the user to include the collected regularization term in an error term later.

Read more about regularization on Wikipedia.

regularize_step

Takes one step for a regularizer that aggregates some information (based on the spike_accumulator function), which is pushed forward and returned for future inclusion in an error term.

spike_accumulator

A spike accumulator that aggregates spikes and returns the total sum as an integer.

voltage_accumulator

An accumulator that aggregates membrane potentials over time.
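A rough sketch of the pattern described above, with a toy spiking layer and hand-rolled spike counting standing in for regularize_step and the default spike_accumulator (the exact signatures are not shown here); the accumulated count is then added to the loss as a penalty term:

    import torch

    # Toy stand-in for a spiking layer step that returns binary spikes;
    # in Norse this would be one of the feedforward step functions below.
    def toy_spiking_step(x):
        return (x > 0.5).float()

    timesteps, batch, features = 10, 4, 8
    inputs = torch.rand(timesteps, batch, features)

    regularization_state = 0  # the state s collected at each forward step
    outputs = []
    for t in range(timesteps):
        z = toy_spiking_step(inputs[t])
        regularization_state = regularization_state + z.sum()  # default: count spikes
        outputs.append(z)

    task_loss = torch.stack(outputs).mean()          # placeholder task loss
    loss = task_loss + 1e-3 * regularization_state   # user includes the penalty term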

Threshold functions#

Stateless spiking neural network components.

heaviside

A Heaviside step function that maps numbers <= 0 to 0 and everything else to 1.
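For example, the mapping can be written in plain PyTorch as:

    import torch

    x = torch.tensor([-1.5, 0.0, 0.3, 2.0])
    spikes = (x > 0).float()  # tensor([0., 0., 1., 1.])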

Temporal operations#

Stateless spiking neural network components.

lift

A function for lifting neuron activation functions in time. Similar to the lift module.
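A minimal sketch of what lifting in time means, hand-rolled rather than the library's implementation: a function defined for a single timestep is applied to every slice along a leading time dimension:

    import torch

    def lift(step_fn):
        """Wrap a per-timestep function so it maps over a leading time axis."""
        def lifted(xs):
            return torch.stack([step_fn(x) for x in xs])
        return lifted

    step = lambda x: (x > 0).float()  # a stateless activation for one timestep
    xs = torch.randn(100, 32, 10)     # (time, batch, features)
    zs = lift(step)(xs)               # applied independently at each timestep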

Neuron models#

Integrate-and-fire (IAF)#

IAFParameters

Parametrization of an integrate-and-fire neuron

IAFFeedForwardState

State of a feed forward integrate-and-fire neuron

iaf_feed_forward_step

Computes a single feedforward step of an integrate-and-fire neuron.
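A hedged usage sketch, assuming the step function follows the common Norse convention of taking the input, the previous state, and the parameters, and returning the output spikes together with the new state (the state field name v below is an assumption):

    import torch
    from norse.torch.functional import (
        IAFParameters,
        IAFFeedForwardState,
        iaf_feed_forward_step,
    )

    p = IAFParameters()
    state = IAFFeedForwardState(v=torch.zeros(2, 10))  # assumed: membrane voltage field

    xs = torch.randn(100, 2, 10)  # (time, batch, features)
    spikes = []
    for x in xs:
        z, state = iaf_feed_forward_step(x, state, p)  # one integration step
        spikes.append(z)
    spikes = torch.stack(spikes)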

Izhikevich#

IzhikevichParameters

Parametrization of an Izhikevich neuron

IzhikevichSpikingBehavior

Spiking behavior of an Izhikevich neuron

tonic_spiking

Tonic spiking behavior of an Izhikevich neuron

tonic_bursting

Tonic bursting behavior of an Izhikevich neuron

phasic_spiking

Phasic spiking behavior of an Izhikevich neuron

phasic_bursting

Phasic bursting behavior of an Izhikevich neuron

mixed_mode

Mixed-mode spiking behavior of an Izhikevich neuron

spike_frequency_adaptation

Spike-frequency adaptation behavior of an Izhikevich neuron

class_1_exc

Class 1 excitable behavior of an Izhikevich neuron

class_2_exc

Class 2 excitable behavior of an Izhikevich neuron

spike_latency

Spike latency behavior of an Izhikevich neuron

subthreshold_oscillation

Subthreshold oscillation behavior of an Izhikevich neuron

resonator

Resonator behavior of an Izhikevich neuron

izhikevich_feed_forward_step

Computes a single Euler integration step of an Izhikevich neuron

Leaky integrator#

Leaky integrators describe a leaky neuron membrane that integrates incoming currents over time, but never spikes. In other words, the neuron adds up the incoming input current, while some of what it has accumulated leaks away in every timestep.

\begin{align*}
\dot{v} &= \frac{1}{\tau_{\text{mem}}} (v_{\text{leak}} - v + i) \\
\dot{i} &= -\frac{1}{\tau_{\text{syn}}} i
\end{align*}

The first equation describes how the membrane voltage (\(v\), the potential across the membrane) changes over time: the voltage decays towards the leak potential (\(v_{\text{leak}}\)) in every timestep, while the incoming current (\(i\)) is added.

The second equation describes how the current flowing into the neuron changes in every timestep.

Notice that both equations are parameterized by a time constant \(\tau\). This constant controls how fast the voltage and current change: a large time constant means a small change per timestep. In Norse, we store this parameter as its inverse (\(\tau_{\text{mem\_inv}}\) and \(\tau_{\text{syn\_inv}}\), respectively) to avoid having to recalculate the inverse. So, in Norse a large inverse time constant means rapid changes, while a small inverse time constant means slow changes.

Recall that voltage is the difference in electric potential between two points (here, across the neuron membrane), and that current is the rate at which charge is added or subtracted at each timestep.

More information can be found on Wikipedia or in the book *Neuronal Dynamics* by W. Gerstner et al., freely available online.
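A worked example of the two equations above, integrated with the forward Euler method in plain PyTorch (a sketch using the inverse time constants, not the library's implementation; the way input is injected into the current is a simplification):

    import torch

    dt = 0.001                # integration timestep (s)
    tau_mem_inv = 1.0 / 1e-2  # inverse membrane time constant (1/s)
    tau_syn_inv = 1.0 / 5e-3  # inverse synaptic time constant (1/s)
    v_leak = 0.0              # leak potential

    v = torch.zeros(10)       # membrane voltage
    i = torch.zeros(10)       # synaptic current

    for t in range(100):
        x = torch.rand(10)                        # input arriving at this timestep
        dv = dt * tau_mem_inv * (v_leak - v + i)  # first equation
        di = -dt * tau_syn_inv * i                # second equation
        v = v + dv
        i = i + di + x                            # input adds to the current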

LIParameters

Parameters of a leaky integrator

LIState

State of a leaky integrator

li_feed_forward_step

Computes a single Euler integration step of a leaky integrator

Leaky integrate-and-fire (LIF)#

Stateless spiking neural network components.

LIF, box model#

A simplified version of the popular leaky integrate-and-fire neuron model that combines a norse.torch.functional.leaky_integrator with spike thresholds to produce events (spikes). Compared to the norse.torch.functional.lif modules, this model leaves out the current term, making it computationally simpler but impossible to implement in physical systems because currents cannot “jump” in nature. It is these sudden current jumps that give the model its name, because the shift in current is instantaneous and can be drawn as “current boxes”.

LIFBoxFeedForwardState

State of a feedforward LIF neuron without a current term

LIFBoxParameters

Parametrization of a LIF neuron without a current term

lif_box_feed_forward_step

Computes a single Euler integration step for a LIF neuron model without a current term.
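A hedged usage sketch of the box model over a sequence, assuming the same (input, state, parameters) to (spikes, new state) convention as the other feedforward step functions (the state field name v is an assumption):

    import torch
    from norse.torch.functional import (
        LIFBoxParameters,
        LIFBoxFeedForwardState,
        lif_box_feed_forward_step,
    )

    p = LIFBoxParameters()
    state = LIFBoxFeedForwardState(v=torch.zeros(4, 16))  # voltage only, no current term

    xs = torch.randn(50, 4, 16)  # (time, batch, features)
    out = []
    for x in xs:
        z, state = lif_box_feed_forward_step(x, state, p)
        out.append(z)
    out = torch.stack(out)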

LIF, conductance based#

CobaLIFParameters

Parameters of a conductance-based LIF neuron.

CobaLIFFeedForwardState

State of a conductance-based feedforward LIF neuron.

coba_lif_feed_forward_step

Euler integration step for a conductance-based LIF neuron.

LIF, adaptive exponential#

LIFAdExParameters

Parametrization of an Adaptive Exponential Leaky Integrate and Fire neuron

LIFAdExFeedForwardState

State of a feed forward LIFAdEx neuron

lif_adex_feed_forward_step

Computes a single Euler integration step of an adaptive exponential LIF neuron model, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.

lif_adex_current_encoder

Computes a single Euler integration step of an adaptive exponential LIF neuron model, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.

LIF, exponential#

LIFExParameters

Parametrization of an Exponential Leaky Integrate and Fire neuron

LIFExFeedForwardState

State of a feed forward LIFEx neuron

lif_ex_feed_forward_step

Computes a single Euler integration step of an exponential LIF neuron model, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.

lif_ex_current_encoder

Computes a single Euler integration step of a leaky integrator, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.

LIF, multicompartmental (MC)#

Stateless spiking neural network components.

LIF, refractory#

Stateless spiking neural network components.

Long short-term memory (LSNN)#

Stateless spiking neural network components.

Receptive fields#

A module for creating receptive fields.

gaussian_kernel(size, c, x, y[, domain])

Efficiently creates a differentiable 2D Gaussian kernel.

spatial_receptive_field(scale, angle, ratio, ...)

Creates a (size x size) receptive field kernel at a given scale, angle and ratio with respect to x and y derivatives.

spatial_receptive_fields_with_derivatives(...)

Creates a number of receptive fields based on the spatial parameters and size of the receptive field.

temporal_scale_distribution(n_scales[, ...])

Provides temporal scales according to [Lindeberg2016].
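As an illustration of the kind of kernel these functions produce, a hand-rolled sketch of an oriented, anisotropic 2D Gaussian (not the library's implementation; the scale/angle/ratio parameterization here is an assumption based on the descriptions above):

    import torch

    def oriented_gaussian(size, scale, angle, ratio):
        """A (size x size) Gaussian stretched by `ratio` and rotated by `angle`."""
        xs = torch.linspace(-1.0, 1.0, size)
        y, x = torch.meshgrid(xs, xs, indexing="ij")
        c, s = torch.cos(torch.tensor(angle)), torch.sin(torch.tensor(angle))
        xr = c * x + s * y    # coordinates rotated by `angle`
        yr = -s * x + c * y
        # Different widths along the two rotated axes give the anisotropy.
        kernel = torch.exp(-(xr ** 2 / (2 * scale ** 2)
                             + yr ** 2 / (2 * (scale * ratio) ** 2)))
        return kernel / kernel.sum()

    kernel = oriented_gaussian(size=9, scale=0.5, angle=0.3, ratio=2.0)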

Plasticity models#

Spike-time dependent plasticity (STDP)#

Stateless spiking neural network components.

Tsodyks-Markram timing-dependent plasticity (TDP)#

Stateless spiking neural network components.