5. Running Tasks#

Norse ships with a number of built-in examples called tasks. These tasks serve to 1) illustrate what types of problems can be solved with Norse and 2) demonstrate how to use Norse for specific tasks. Please note that some tasks require additional dependencies that are not included in vanilla Norse, such as OpenAI Gym for the cartpole task.

5.1. Parameters#

The tasks below expose a large number of configurable parameters that control the model/network size, the number of epochs, the batch size, the task load, the PyTorch device, the learning rate, and much more.

Another very important parameter determines which backpropagation model to use. This is particularly important for spiking neural network models and is described in more detail on the page about learning.

All programs below are built with Abseil Python, which gives you extensive command line interface (CLI) help descriptions where you can look up the parameters above - and discover many more. You can access the help text by passing the --help flag to any of the tasks below, for instance python -m norse.task.mnist --help.
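As an illustration of how such parameters are typically defined, the sketch below shows a generic Abseil Python program. The flag names (epochs, batch_size, learning_rate, device) are hypothetical and chosen for this example only; consult the --help output of each task for the flags it actually defines.

from absl import app, flags

FLAGS = flags.FLAGS

# Hypothetical flags for illustration only; check --help for the real ones.
flags.DEFINE_integer("epochs", 10, "Number of training epochs.")
flags.DEFINE_integer("batch_size", 32, "Samples per training batch.")
flags.DEFINE_float("learning_rate", 1e-3, "Optimizer learning rate.")
flags.DEFINE_string("device", "cpu", "PyTorch device, e.g. cpu or cuda.")

def main(argv):
    # Parsed flag values are available on the FLAGS object.
    print(f"Training for {FLAGS.epochs} epochs on {FLAGS.device}")

if __name__ == "__main__":
    app.run(main)  # Parses the CLI arguments and generates the --help text.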

5.2. Cartpole#

This task is a balancing exercise where a controller learns to counter the gravitational force acting on an upright pole mounted on a cart. You will need to install OpenAI Gym, which provides the simulation environment. If you are using pip, this is as easy as typing pip install gym.

The cartpole task can be run like so:

python3 -m norse.task.cartpole
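
To verify that Gym is installed correctly before running the task, you can construct the classic cartpole environment by hand (a minimal sketch assuming the CartPole-v1 environment id shipped with Gym):

import gym

env = gym.make("CartPole-v1")
print(env.observation_space)  # cart position/velocity and pole angle/velocity
print(env.action_space)       # Discrete(2): push the cart left or right
env.close()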

5.3. Cifar10#

CIFAR-10 is a labeled dataset of 60,000 32x32 images in 10 classes. The task is to learn to classify each image.

The CIFAR-10 task can be run without any additional dependencies like so:

python3 -m norse.task.cifar10

5.4. Correlation experiment#

The correlation experiment demonstrates how neurons can learn patterns with a certain probability. It can be run without any additional dependencies like so:

python3 -m norse.task.correlation_experiment

5.5. Memory task#

The memory task demonstrates how a recurrent spiking neural network can store a pattern and later recall it, similar to the STORE/RECALL experiment in the paper Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets by Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Robert Legenstein, and Wolfgang Maass.

The task can be run without any additional dependencies like so:

python3 -m norse.task.memory

5.6. MNIST#

MNIST is a database of 70,000 images of handwritten digits, where the task is to learn to classify each image as one of the 10 digits.

The task can be run without any additional dependencies like so:

python3 -m norse.task.mnist
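
For a rough idea of what such a task does internally, the sketch below classifies fake MNIST-sized inputs with a tiny spiking network. It is a minimal illustration only, assuming the LIFCell and LICell modules exported by norse.torch, and does not mirror the actual task implementation:

import torch
from norse.torch import LIFCell, LICell

# Toy network: 28x28 input -> 100 spiking neurons -> 10 output classes.
linear_in = torch.nn.Linear(28 * 28, 100)
lif = LIFCell()                       # leaky integrate-and-fire layer
linear_out = torch.nn.Linear(100, 10)
readout = LICell()                    # non-spiking leaky integrator readout

images = torch.rand(8, 28 * 28)       # fake batch of 8 flattened "images"

s_lif, s_li = None, None
voltages = []
for _ in range(32):                   # simulate 32 time steps
    # Poisson-like encoding: spike with probability given by pixel intensity.
    spikes = (torch.rand_like(images) < images).float()
    z, s_lif = lif(linear_in(spikes), s_lif)
    v, s_li = readout(linear_out(z), s_li)
    voltages.append(v)

# Classify by the maximum readout voltage over time.
prediction = torch.stack(voltages).max(dim=0).values.argmax(dim=1)
print(prediction)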

5.7. MNIST in PyTorch Lightning#

This task is similar to the MNIST task above, but built with PyTorch Lightning. PyTorch Lightning is a library for building, training, scaling, and verifying models with little overhead. It also provides GPU parallelisation, logging with e.g. TensorBoard, model checkpointing, and much more. We have successfully used PyTorch and Norse on the JUWELS HPC system.

Note that the task depends on an installation of PyTorch Lightning: pip install pytorch-lightning

python -m norse.task.mnist_pl
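
The general shape of such a Lightning-based task can be sketched as follows. This is a generic PyTorch Lightning skeleton, not the actual norse.task.mnist_pl implementation, and the class and variable names are made up for this example:

import torch
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self, model):
        super().__init__()
        self.model = model  # e.g. a spiking Norse model

    def training_step(self, batch, batch_idx):
        images, labels = batch
        loss = torch.nn.functional.cross_entropy(self.model(images), labels)
        self.log("train_loss", loss)  # picked up by e.g. the TensorBoard logger
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Lightning then provides the training loop, device handling, and checkpointing:
# trainer = pl.Trainer(max_epochs=5)
# trainer.fit(LitClassifier(my_model), train_dataloader)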

5.8. Speech Command recognition task#

The speech commands dataset, as described in the linked article, serves as an example of a temporal classification task. The corresponding example task can be run like so:

python -m norse.task.speech_commands.run