norse.torch.module.lsnn module¶
Long short-term memory module, building on the work by [G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass](https://github.com/IGITUGraz/LSNN-official).
See norse.torch.functional.lsnn
for more information.
- class norse.torch.module.lsnn.LSNN(p=LSNNParameters(tau_syn_inv=tensor(200.), tau_mem_inv=tensor(100.), tau_adapt_inv=tensor(0.0012), v_leak=tensor(0.), v_th=tensor(1.), v_reset=tensor(0.), beta=tensor(1.8000), method='super', alpha=100.0), adjoint=False, **kwargs)[source]¶
Bases:
norse.torch.module.snn.SNN
A long short-term memory neuron module without recurrence, adapted from https://arxiv.org/abs/1803.09574
- Usage:
>>> import torch
>>> from norse.torch import LSNN
>>> layer = LSNN()
>>> data = torch.zeros(5, 2)
>>> output, state = layer.forward(data)
- Parameters
p (LSNNParameters) – The neuron parameters as a torch Module, which allows the module to configure neuron parameters as optimizable.
dt (float) – Time step to use in integration. Defaults to 0.001.
Initializes internal Module state, shared by both nn.Module and ScriptModule.
- class norse.torch.module.lsnn.LSNNCell(p=LSNNParameters(tau_syn_inv=tensor(200.), tau_mem_inv=tensor(100.), tau_adapt_inv=tensor(0.0012), v_leak=tensor(0.), v_th=tensor(1.), v_reset=tensor(0.), beta=tensor(1.8000), method='super', alpha=100.0), adjoint=False, **kwargs)[source]¶
Bases:
norse.torch.module.snn.SNNCell
Euler integration cell for a LIF neuron with threshold adaptation, without recurrence. More specifically, it implements one integration step of the following ODE
\[\begin{split}\begin{align*} \dot{v} &= 1/\tau_{\text{mem}} (v_{\text{leak}} - v + i) \\ \dot{i} &= -1/\tau_{\text{syn}} i \\ \dot{b} &= -1/\tau_{b} b \end{align*}\end{split}\]together with the jump condition
\[z = \Theta(v - v_{\text{th}} + b)\]and transition equations
\[\begin{split}\begin{align*} v &= (1-z) v + z v_{\text{reset}} \\ i &= i + \text{input} \\ b &= b + \beta z \end{align*}\end{split}\]- Parameters
p (LSNNParameters) – The neuron parameters as a torch Module, which allows the module to configure neuron parameters as optimizable.
dt (float) – Time step to use in integration. Defaults to 0.001.
Initializes internal Module state, shared by both nn.Module and ScriptModule.
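The equations above can be written out as one explicit Euler step in plain PyTorch. The sketch below is for illustration only and is not the library's implementation (which lives in norse.torch.functional.lsnn); the helper function `lsnn_step` and its keyword defaults, which mirror the default LSNNParameters values, are hypothetical:

```python
import torch

def lsnn_step(input_spikes, v, i, b, dt=0.001,
              tau_syn_inv=200.0, tau_mem_inv=100.0, tau_adapt_inv=0.0012,
              v_leak=0.0, v_th=1.0, v_reset=0.0, beta=1.8):
    # Euler integration of the membrane, synapse, and adaptation ODEs
    v = v + dt * tau_mem_inv * (v_leak - v + i)
    i = i - dt * tau_syn_inv * i
    b = b - dt * tau_adapt_inv * b
    # Jump condition: spike when v crosses the adaptive threshold
    z = (v - v_th + b > 0).to(v.dtype)
    # Transition equations: reset, synaptic input, threshold adaptation
    v = (1 - z) * v + z * v_reset
    i = i + input_spikes
    b = b + beta * z
    return z, v, i, b

# A membrane already above threshold spikes, resets, and raises b by beta
v0 = torch.full((2, 4), 1.5)
z, v, i, b = lsnn_step(torch.zeros(2, 4), v0,
                       torch.zeros(2, 4), torch.zeros(2, 4))
# z is all ones, v is reset to v_reset = 0, b is raised to beta = 1.8
```

Note how the adaptation variable b effectively raises the firing threshold after each spike, which is what gives the neuron its longer memory.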
- class norse.torch.module.lsnn.LSNNRecurrent(input_size, hidden_size, p=LSNNParameters(tau_syn_inv=tensor(200.), tau_mem_inv=tensor(100.), tau_adapt_inv=tensor(0.0012), v_leak=tensor(0.), v_th=tensor(1.), v_reset=tensor(0.), beta=tensor(1.8000), method='super', alpha=100.0), adjoint=False, **kwargs)[source]¶
Bases:
norse.torch.module.snn.SNNRecurrent
A long short-term memory neuron module with recurrence, adapted from https://arxiv.org/abs/1803.09574
- Usage:
>>> import torch
>>> from norse.torch.module import LSNNRecurrent
>>> layer = LSNNRecurrent(2, 10)         # Shape 2 -> 10
>>> data = torch.zeros(2, 5, 2)          # Arbitrary data
>>> output, state = layer.forward(data)  # Out: (2, 5, 10)
- Parameters
input_size (int) – Size of the input. Also known as the number of input features.
hidden_size (int) – Size of the hidden state. Also known as the number of hidden features.
p (LSNNParameters) – The neuron parameters as a torch Module, which allows the module to configure neuron parameters as optimizable.
dt (float) – Time step to use in integration. Defaults to 0.001.
Initializes internal Module state, shared by both nn.Module and ScriptModule.
- class norse.torch.module.lsnn.LSNNRecurrentCell(input_size, hidden_size, p=LSNNParameters(tau_syn_inv=tensor(200.), tau_mem_inv=tensor(100.), tau_adapt_inv=tensor(0.0012), v_leak=tensor(0.), v_th=tensor(1.), v_reset=tensor(0.), beta=tensor(1.8000), method='super', alpha=100.0), adjoint=False, **kwargs)[source]¶
Bases:
norse.torch.module.snn.SNNRecurrentCell
Module that computes a single Euler integration step of an LSNN neuron model with recurrence. More specifically, it implements one integration step of the following ODE
\[\begin{split}\begin{align*} \dot{v} &= 1/\tau_{\text{mem}} (v_{\text{leak}} - v + i) \\ \dot{i} &= -1/\tau_{\text{syn}} i \\ \dot{b} &= -1/\tau_{b} b \end{align*}\end{split}\]together with the jump condition
\[z = \Theta(v - v_{\text{th}} + b)\]and transition equations
\[\begin{split}\begin{align*} v &= (1-z) v + z v_{\text{reset}} \\ i &= i + w_{\text{input}} z_{\text{in}} \\ i &= i + w_{\text{rec}} z_{\text{rec}} \\ b &= b + \beta z \end{align*}\end{split}\]where \(z_{\text{rec}}\) and \(z_{\text{in}}\) are the recurrent and input spikes respectively.
- Parameters
input_size (int) – Size of the input. Also known as the number of input features.
hidden_size (int) – Size of the hidden state. Also known as the number of hidden features.
p (LSNNParameters) – The neuron parameters as a torch Module, which allows the module to configure neuron parameters as optimizable.
dt (float) – Time step to use in integration. Defaults to 0.001.
Initializes internal Module state, shared by both nn.Module and ScriptModule.