norse.torch.functional#
Encoding#
- Encodes input currents as fixed (constant) voltage currents, and simulates the spikes that occur during a number of timesteps/iterations (seq_length).
- A Gaussian radial basis kernel that calculates the radial basis given a distance value (the distance between \(x\) and a data value \(x'\), or \(\|\mathbf{x} - \mathbf{x'}\|^2\) below).
- Simple Euclidean distance metric.
- Encodes a set of input values into population codes, such that each single input value is represented by a list of numbers (typically calculated by a radial basis kernel) whose length is equal to out_features.
- Encodes a tensor of input values, assumed to be in the range [0, 1], into a tensor of one dimension higher of binary values, which represent input spikes.
- Encodes a tensor of input values, assumed to be in the range [0, 1], into a tensor of binary values, which represent input spikes.
- Encodes a tensor of input values, assumed to be in the range [-1, 1], into a tensor of one dimension higher of binary values, which represent input spikes.
- Creates a Poisson-distributed signed spike vector.
- Encodes an input value by the time at which the first spike occurs.
- For all neurons, removes all but the first spike.
- Computes a single Euler-integration step of a leaky integrator.
- Computes a single Euler-integration step of an adaptive exponential LIF neuron model, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.
- Computes a single Euler-integration step of a leaky integrator, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.
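As a rough sketch of the encodings above (plain Python for illustration; Norse's actual implementations operate on PyTorch tensors and their exact kernels and signatures may differ), a population code maps each scalar through a radial basis kernel per centre, and rate-based spike encoding treats values in [0, 1] as per-timestep spike probabilities:

```python
import math
import random

def gaussian_rbf(dist_sq, sigma=1.0):
    # Radial basis value for a squared distance ||x - x'||^2.
    return math.exp(-dist_sq / (2 * sigma ** 2))

def population_encode(value, centres):
    # Each scalar is represented by one kernel response per centre,
    # so the code length equals the number of out_features (centres).
    return [gaussian_rbf((value - c) ** 2) for c in centres]

def spike_encode(rates, seq_length, seed=0):
    # Treat each value in [0, 1] as a per-timestep spike probability,
    # yielding a (seq_length x neurons) binary train -- one (time)
    # dimension more than the input.
    rng = random.Random(seed)
    return [[1 if rng.random() < r else 0 for r in rates]
            for _ in range(seq_length)]

code = population_encode(0.5, [0.0, 0.25, 0.5, 0.75, 1.0])
train = spike_encode([0.0, 1.0, 0.5], seq_length=8)
```

Note that the kernel response peaks at the centre closest to the encoded value, which is what makes the population code informative.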
Logical#
- Computes a logical AND, provided x and y are bit vectors.
- Computes a logical XOR, provided x and y are bit vectors.
- Computes a logical OR, provided x and y are bit vectors.
- Computes the Muller C-element next state, provided x_1 and x_2 are bit vectors and y_prev is the previous state.
- Determines whether a transition from 0 to 1 has occurred, provided that z and z_prev are bit vectors.
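As an illustrative sketch (not Norse's tensor-based implementation), these bit-vector operations can be written elementwise:

```python
def logical_xor(x, y):
    # Elementwise XOR of two bit vectors.
    return [a ^ b for a, b in zip(x, y)]

def muller_c(x1, x2, y_prev):
    # Muller C-element: the output follows the inputs when they agree
    # and otherwise holds its previous state.
    return [a if a == b else p for a, b, p in zip(x1, x2, y_prev)]

def posedge_detect(z, z_prev):
    # 1 wherever a 0 -> 1 transition occurred between two timesteps.
    return [c & (1 - p) for c, p in zip(z, z_prev)]
```

The C-element's hold behaviour is what distinguishes it from a plain AND/OR: it only switches when both inputs agree.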
Regularization#
- Takes one step for a regularizer that aggregates some information (based on the spike_accumulator function), which is pushed forward and returned for future inclusion in an error term.
- A spike accumulator that aggregates spikes and returns the total sum as an integer.
- A spike accumulator that aggregates membrane potentials over time.
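As a sketch of the idea (the names and the accumulator shown here are illustrative assumptions, not Norse's API), a spike-count regularizer simply carries an aggregated statistic forward through time:

```python
def spike_accumulator(spikes):
    # Total number of spikes in this timestep, as an integer.
    return int(sum(spikes))

def regularize_step(spikes, state=0):
    # One regularization step: aggregate the statistic and carry it
    # forward so it can later enter an error term (e.g. to penalise
    # excessive firing rates).
    return state + spike_accumulator(spikes)

state = regularize_step([0, 1, 1], state=0)
state = regularize_step([1, 0, 0], state=state)
```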
Threshold functions#
- A Heaviside step function that maps numbers <= 0 to 0 and everything else to 1.
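For example, the step non-linearity can be sketched as follows; note that an input of exactly 0 yields 0:

```python
def heaviside(x):
    # Maps each value <= 0 to 0 and every positive value to 1 --
    # the spiking non-linearity applied to membrane voltages.
    return [0 if v <= 0 else 1 for v in x]
```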
Temporal operations#
- Creates a lifted version of the given activation function, which applies the activation function in the temporal domain.
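The lifting idea can be sketched as follows (a simplification: Norse lifts over tensor time dimensions, whereas this toy version iterates over a Python sequence, and `running_sum` is a made-up stateful step used only for the demonstration):

```python
def lift(step_fn):
    # Lifted version: apply a single-step (input, state) -> (output, state)
    # function across the leading (time) dimension of a sequence.
    def lifted(inputs, state=None):
        outputs = []
        for x in inputs:
            out, state = step_fn(x, state)
            outputs.append(out)
        return outputs, state
    return lifted

def running_sum(x, state):
    # Toy stateful "activation" used to demonstrate the lifting.
    state = (state or 0) + x
    return state, state

outputs, final_state = lift(running_sum)([1, 2, 3])
```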
Neuron models#
Integrate-and-fire (IAF)#
- Parametrization of an integrate-and-fire neuron.
- State of a feed-forward integrate-and-fire neuron.
- Computes a single feed-forward step of an integrate-and-fire neuron.
Izhikevich#
- Parametrization of an Izhikevich neuron.
- Spiking behavior of an Izhikevich neuron (one entry per predefined firing pattern).
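The underlying dynamics follow Izhikevich's two-variable model; the sketch below uses one illustrative parameter set (Norse exposes several such presets as the firing patterns listed above):

```python
def izhikevich_step(v, u, current, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    # Euler step of the Izhikevich model:
    #   dv/dt = 0.04 v^2 + 5 v + 140 - u + I
    #   du/dt = a (b v - u)
    # with reset v <- c, u <- u + d whenever v crosses 30 mV.
    v = v + dt * (0.04 * v * v + 5 * v + 140 - u + current)
    u = u + dt * a * (b * v - u)
    spiked = v >= 30.0
    if spiked:
        v, u = c, u + d
    return v, u, spiked

# Drive the neuron with a constant current and count the spikes
v, u, spikes = -65.0, -13.0, 0
for _ in range(1000):
    v, u, z = izhikevich_step(v, u, current=10.0)
    spikes += z
```

The quadratic voltage term is what lets the model reproduce many firing patterns from just the four constants a, b, c, d.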
Leaky integrator#
Leaky integrators describe a leaky neuron membrane that integrates incoming currents over time, but never spikes. In other words, the neuron adds up incoming input current, while leaking some of it out in every timestep:
\[
\dot{v} = \tau_{\text{mem}}^{-1}\big((v_{\text{leak}} - v) + i\big), \qquad
\dot{i} = -\tau_{\text{syn}}^{-1}\, i
\]
The first equation describes how the membrane voltage (\(v\), across the membrane) changes over time: the voltage decays towards the leak potential (\(v_{\text{leak}}\)) in every timestep, while the input current (\(i\)) is added.
The second equation describes how the current flowing into the neuron decays in every timestep.
Notice that both equations are parameterized by the time constant \(\tau\). This constant controls how fast the changes in voltage and current occur: a large time constant means a small change per timestep. Norse stores this parameter as its inverse (\(\tau_{\text{mem\_inv}}\) and \(\tau_{\text{syn\_inv}}\), respectively) to avoid recalculating the inverse at every step, so a large inverse time constant means rapid changes, while a small inverse time constant means slow changes.
Recall that voltage is the difference in electric potential between two points (here, across the neuron membrane), and that the current term determines how much charge is added to or removed from the membrane in each timestep.
More information can be found on Wikipedia.
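A minimal Euler-integration sketch of these dynamics (the constants are assumed defaults for illustration, not Norse's actual parameter values):

```python
def li_step(v, i, input_current, dt=0.001,
            tau_mem_inv=100.0, tau_syn_inv=200.0, v_leak=0.0):
    # One Euler step: the voltage decays towards v_leak while the
    # synaptic current i is integrated; i itself decays exponentially
    # and incoming current jumps directly onto it.
    v = v + dt * tau_mem_inv * ((v_leak - v) + i)
    i = i * (1 - dt * tau_syn_inv) + input_current
    return v, i

v, i = 0.0, 0.0
v, i = li_step(v, i, input_current=1.0)  # input jumps onto i
v, i = li_step(v, i, input_current=0.0)  # voltage starts to follow
```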
- Parameters of a leaky integrator.
- State of a leaky integrator.
Leaky integrate-and-fire (LIF)#
A popular neuron model that combines a norse.torch.functional.leaky_integrator with spike thresholds to produce events (spikes). The model describes the change in a neuron membrane voltage (\(v\)) and inflow current (\(i\)); see the leaky_integrator module for more information.
The F in LIF stands for the thresholded "firing" events that occur if the neuron voltage increases over a certain point or threshold (\(v_{\text{th}}\)). In regular artificial neural networks, this is referred to as the activation function. The behaviour can be controlled by setting the method field in the neuron parameters, but it defaults to the superspike synthetic-gradient approach, which uses the heaviside step function in the forward pass.
More information can be found on Wikipedia or in the book *Neuronal Dynamics* by W. Gerstner et al., freely available online.
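A sketch of a single LIF step under this model (leaky-integrator dynamics plus threshold and reset; the constants are illustrative assumptions, not Norse's exact parameterisation):

```python
def lif_step(v, i, input_current, dt=0.001, tau_mem_inv=100.0,
             tau_syn_inv=200.0, v_leak=0.0, v_th=1.0, v_reset=0.0):
    # Leaky-integrator dynamics plus a threshold: a spike (z = 1) is
    # emitted when v crosses v_th, after which v jumps back to v_reset.
    v = v + dt * tau_mem_inv * ((v_leak - v) + i)
    i = i * (1 - dt * tau_syn_inv) + input_current
    z = 1 if v > v_th else 0
    if z:
        v = v_reset
    return z, v, i

# A constant suprathreshold input produces a regular spike train
state = (0.0, 0.0)
spikes = 0
for _ in range(100):
    z, *state = lif_step(*state, input_current=0.5)
    spikes += z
```

In training, the hard threshold above is replaced in the backward pass by a smooth surrogate (such as superspike), which is what makes the model differentiable.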
- Parametrization of a LIF neuron.
- State of a feed-forward LIF neuron.
- Computes a single Euler-integration step for a LIF neuron model.
- Implements a single Euler forward and adjoint backward step of a leaky integrate-and-fire neuron with current-based exponential synapses.
- Implements a single Euler forward and adjoint backward step of a leaky integrate-and-fire neuron with current-based exponential synapses.
LIF, box model#
A simplified version of the popular leaky integrate-and-fire neuron model, combining a norse.torch.functional.leaky_integrator with spike thresholds to produce events (spikes). Compared to the norse.torch.functional.lif modules, this model leaves out the current term, which makes it computationally simpler but impossible to implement in physical systems, because currents cannot "jump" in nature. It is these sudden current jumps that give the model its name: the shift in current is instantaneous and can be drawn as "current boxes".
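Without a current state, an integration step reduces to something like the following sketch (constants assumed for illustration):

```python
def lif_box_step(v, input_current, dt=0.001, tau_mem_inv=100.0,
                 v_leak=0.0, v_th=1.0, v_reset=0.0):
    # No synaptic current state: the input "jumps" directly onto the
    # membrane voltage, hence the "box" name.
    v = v + dt * tau_mem_inv * (v_leak - v) + input_current
    z = 1 if v > v_th else 0
    if z:
        v = v_reset
    return z, v

z, v = lif_box_step(0.0, input_current=1.5)   # suprathreshold input
z2, v2 = lif_box_step(0.0, input_current=0.5)  # subthreshold input
```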
- State of a feed-forward LIF neuron.
- Parametrization of a boxed LIF neuron.
- Computes a single Euler-integration step for a LIF neuron model without current terms.
LIF, conductance based#
- Parameters of a conductance-based LIF neuron.
- State of a conductance-based feed-forward LIF neuron.
- Euler integration step for a conductance-based LIF neuron.
LIF, adaptive exponential#
- Parametrization of an adaptive exponential leaky integrate-and-fire neuron.
- State of a feed-forward LIFAdEx neuron.
- Computes a single Euler-integration step of an adaptive exponential LIF neuron model, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.
- Computes a single Euler-integration step of an adaptive exponential LIF neuron model, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model.
LIF, exponential#
- Parametrization of an exponential leaky integrate-and-fire neuron.
- State of a feed-forward LIFEx neuron.
- Computes a single Euler-integration step of an exponential LIF neuron model, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.
- Computes a single Euler-integration step of a leaky integrator, adapted from https://neuronaldynamics.epfl.ch/online/Ch5.S2.html.
LIF, multicompartmental (MC)#
- Computes a single Euler-integration feed-forward step of a LIF multi-compartment neuron model.
LIF, refractory#
- Parameters of a LIF neuron with absolute refractory period.
- State of a feed-forward LIF neuron with absolute refractory period.
- Computes a single Euler-integration step of a feed-forward LIF neuron with absolute refractory period.
- Implements a single Euler forward and adjoint backward step of a leaky integrate-and-fire neuron with current-based exponential synapses and a refractory period.
Long short-term memory spiking neural networks (LSNN)#
- Parameters of an LSNN neuron.
- Integration state kept for an LSNN module.
- Euler integration step for a LIF neuron with threshold adaptation.
- Implements a single Euler forward and adjoint backward step of a LIF neuron with adaptive threshold and current-based exponential synapses.
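The distinctive part of the LSNN neuron is its adaptive threshold, which can be sketched as follows (the constants and the exact update order are illustrative assumptions):

```python
def lsnn_threshold_step(b, z, dt=0.001, tau_adapt_inv=1.0,
                        v_th=1.0, beta=1.8):
    # The adaptation variable b decays slowly and jumps on each spike z;
    # the effective firing threshold is v_th + beta * b, so a neuron
    # that just fired becomes temporarily harder to excite.
    b = b * (1 - dt * tau_adapt_inv) + z
    return v_th + beta * b, b

threshold, b = lsnn_threshold_step(b=0.0, z=1)  # spike raises the threshold
```

The slow decay of b gives the network a memory over timescales much longer than the membrane time constant, which motivates the "long short-term memory" name.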
Plasticity models#
Spike-time dependent plasticity (STDP)#
- Parameters of an STDP sensor as used for event-driven plasticity rules.
- State of an event-driven STDP sensor.
- Event-driven STDP rule.
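A sketch of an event-driven, trace-based STDP update (all constants, and the exact trace/weight update ordering, are illustrative assumptions rather than Norse's implementation):

```python
import math

def stdp_step(a_pre, a_post, z_pre, z_post, w, dt=0.001,
              tau_pre_inv=50.0, tau_post_inv=50.0,
              eta_plus=0.01, eta_minus=0.01):
    # Decay the pre- and post-synaptic eligibility traces, then add spikes.
    a_pre = a_pre * math.exp(-dt * tau_pre_inv) + z_pre
    a_post = a_post * math.exp(-dt * tau_post_inv) + z_post
    # Potentiate when a post spike follows pre activity; depress when a
    # pre spike follows post activity.
    w = w + eta_plus * a_pre * z_post - eta_minus * a_post * z_pre
    return a_pre, a_post, w

# Pre-before-post pairing strengthens the synapse
a_pre, a_post, w = 0.0, 0.0, 0.0
a_pre, a_post, w = stdp_step(a_pre, a_post, 1, 0, w)
a_pre, a_post, w = stdp_step(a_pre, a_post, 0, 1, w)
```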
Tsodyks-Markram timing-dependent plasticity (TDP)#
- Parameters of the Tsodyks-Markram model.
- State of the Tsodyks-Markram model; note that the input current state is tracked separately.
- Euler integration step for the Tsodyks-Markram model of short-term plasticity (STP).
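The facilitation/depression dynamics of this model can be sketched as follows (the constants are illustrative, not Norse's defaults):

```python
def stp_step(u, x, z, dt=0.001, tau_facil=0.05, tau_depr=0.1, U=0.45):
    # u: facilitation (utilisation / release probability),
    # x: available synaptic resources. Both evolve continuously and
    # jump on a presynaptic spike z (Euler integration).
    u = u + dt * (-u / tau_facil) + U * (1.0 - u) * z
    x = x + dt * ((1.0 - x) / tau_depr) - u * x * z
    # Effective synaptic efficacy released by this spike:
    return u, x, u * x * z

u, x, eff = stp_step(0.0, 1.0, z=1)
```

Repeated spikes raise u (facilitation) while depleting x (depression), so the effective efficacy u * x depends on the recent spike history.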