norse.torch.functional.lif_adex.lif_adex_feed_forward_step

norse.torch.functional.lif_adex.lif_adex_feed_forward_step(input_tensor: torch.Tensor, state: norse.torch.functional.lif_adex.LIFAdExFeedForwardState = LIFAdExFeedForwardState(v=0, i=0, a=0), p: norse.torch.functional.lif_adex.LIFAdExParameters = LIFAdExParameters(adaptation_current=tensor(4), adaptation_spike=tensor(0.0200), delta_T=tensor(0.5000), tau_ada_inv=tensor(2.), tau_syn_inv=tensor(200.), tau_mem_inv=tensor(100.), v_leak=tensor(0.), v_th=tensor(1.), v_reset=tensor(0.), method='super', alpha=100.0), dt: float = 0.001) → Tuple[torch.Tensor, norse.torch.functional.lif_adex.LIFAdExFeedForwardState]

Computes a single Euler-integration step of an adaptive exponential LIF neuron model, adapted from http://www.scholarpedia.org/article/Adaptive_exponential_integrate-and-fire_model. It takes as input the input current generated by an arbitrary torch module or function. More specifically, it implements one integration step of the following ODE:

\[\begin{split}\begin{align*} \dot{v} &= 1/\tau_{\text{mem}} \left(v_{\text{leak}} - v + i + \Delta_T \exp\left({{v - v_{\text{th}}} \over {\Delta_T}}\right)\right) \\ \dot{i} &= -1/\tau_{\text{syn}} i \\ \dot{a} &= 1/\tau_{\text{ada}} \left( a_{\text{current}} \left(v - v_{\text{leak}}\right) - a \right) \end{align*}\end{split}\]

together with the jump condition

\[z = \Theta(v - v_{\text{th}})\]

and transition equations

\[\begin{split}\begin{align*} v &= (1-z) v + z v_{\text{reset}} \\ i &= i + i_{\text{in}} \\ a &= a + a_{\text{spike}} z \end{align*}\end{split}\]

where \(i_{\text{in}}\) is the result of applying an arbitrary PyTorch module (such as a convolution) to input spikes.

Parameters:

input_tensor (torch.Tensor): the input spikes at the current time step

state (LIFAdExFeedForwardState): current state of the LIF neuron

p (LIFAdExParameters): parameters of an adaptive exponential leaky integrate-and-fire neuron

dt (float): integration timestep to use