EchoTorch is a Python module based on PyTorch for implementing and testing various flavours of Echo State Network (ESN) models. EchoTorch is intended for research purposes, not for production use. As it is based on PyTorch, EchoTorch's layers are designed to be integrated into deep architectures for future work and research.
Join our community to create datasets and deep-learning models! Chat with us on Gitter and join the Google Group to collaborate with us.
This repository consists of:
- echotorch.datasets : Pre-built datasets for common ESN tasks.
- echotorch.evaluation : Tools and functions to evaluate and compare ESN models (cross-validation, statistical tests, etc.).
- echotorch.models : Ready-to-train models and generic pre-trained ESN models.
- echotorch.nn : All neural network Torch components for ESN and Reservoir Computing.
- echotorch.transforms : Data transformations specific to ESNs.
- echotorch.utils : Tools, functions and measures for ESN and Reservoir Computing.
- echotorch.utils.conceptors : Utility classes and functions in relation with conceptor neural filters.
- echotorch.utils.matrix_generation : Classes to generate the different matrices used in ESNs.
- echotorch.utils.optimization : Implementation of classical optimization algorithms for hyperparameter optimization.
- echotorch.utils.visualisation : Various classes and functions for data and model visualisation.
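These subpackages are imported in the usual way, for example (a trivial sketch; nothing beyond the subpackage names listed above is assumed):

import echotorch.datasets    # pre-built datasets for ESN tasks
import echotorch.nn          # ESN and Reservoir Computing layers
import echotorch.utils       # tools, measures and matrix generators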
Here are some examples of what you can do with EchoTorch.
- Conceptors
- Boolean operations : Boolean operations with Conceptors.
- Pattern evidences : Evidence gathering for pattern classification with Conceptors.
- Four patterns generation : Load four patterns into a reservoir and re-generate them with conceptor-based neural filtering.
- Incremental loading and memory management : How to load patterns into an ESN's memory incrementally and manage memory usage.
- Memory management :
- Memory management and incremental forgetting : Load patterns into an ESN's memory with the possibility of erasing old patterns (research in progress).
- Morphing periodic patterns : Learning, generating and morphing a set of periodic patterns.
- Morphing periodic sine : Learning, generating and morphing a set of sine patterns.
- Morphing random patterns : Learning, generating and morphing a set of random patterns.
- Morphing random sine : Learning, generating and morphing a set of sine patterns with random periods.
- Morphing sines :
- Morphing square : Learn four patterns and visualize multiple morphed patterns.
- Subspace demo : Show how patterns populate the reservoir's state space and how to define them with neural filters.
- Datasets
- Latch-Copy-Repeat : How to generate data for three well-known tasks in Machine Learning.
- Logistic Map : Generate data from the logistic map function.
- MNIST_images : Load images from the MNIST dataset.
- NARMA : Generate NARMA timeseries (a standalone sketch of the recurrence follows this list).
- Strange attractors : Generate timeseries data from common strange attractors.
- Timeseries batch sequencing : Transform a timeseries into sequences of a specific length (to train a feed-forward network, for example).
- Timeseries triplet batching : Get triplets made of an anchor, a positive example (same class) and a negative example (different class) to train similarity measures.
- Evaluation
- Fold cross-validation : How to perform 10-fold cross-validation.
- Generation
- NARMA-10 generation with feedback : Generate NARMA-10 timeseries with feedback.
- Matrix generation
- Cycle with jumps : Generation of a matrix composed of a cycle with jumps (Rodan and Tino, 2012); a standalone sketch follows this list. (to write)
- Normal matrix : Generation based on a Gaussian distribution. (to write)
- Uniform matrix : Generation based on a uniform distribution. (to write)
- Memory
- Memtest : Test the capacity of an ESN to memorize random inputs.
- MNIST
- Image to timeseries conversion : How to convert images to timeseries.
- Nodes
- Independent Component Analysis : How to do Independent Component Analysis (ICA) with EchoTorch.
- Principal Component Analysis : How to do Principal Component Analysis (PCA) with EchoTorch.
- Slow Feature Analysis : How to do Slow Feature Analysis (SFA) with EchoTorch.
- Optimization
- Genetic search : Optimize hyper-parameters with a genetic algorithm.
- Grid search : Optimize hyper-parameters with a grid search.
- Random search : Generate models with random parameters and keep the best.
- Switch between attractors
- Switch Attractor : Test the capacity of a simple ESN to switch between attractors.
- Timeseries prediction
- Mackey Glass : Mackey-Glass timeseries prediction with ESN.
- NARMA-10 : NARMA-10 timeseries prediction with ESN and original training methods (ridge regression).
- NARMA-10 for reservoir sizes : NARMA-10 timeseries prediction with ESN and different reservoir sizes.
- NARMA-10 with gradient descent : NARMA-10 timeseries prediction with ESN and gradient descent (this does not work; see the tutorials).
- NARMA-10 with Gated-ESN : NARMA-10 prediction with Gated-ESN (ESN + PCA + LSTM).
- NARMA-10 with Stacked-ESN : NARMA-10 prediction with Stacked-ESN.
- Unsupervised Learning
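As a taste of what these examples cover, NARMA timeseries (see the NARMA entry above) come from a simple recurrence. Here is a minimal standalone sketch of a NARMA-10 generator in plain NumPy; it does not use EchoTorch's own dataset classes, and the function name and parameters are ours:

import numpy as np

def narma10(n_steps, seed=None):
    # NARMA-10 recurrence:
    # y[t+1] = 0.3*y[t] + 0.05*y[t]*sum(y[t-9..t]) + 1.5*u[t-9]*u[t] + 0.1
    rng = np.random.default_rng(seed)
    u = rng.uniform(0.0, 0.5, size=n_steps)  # inputs drawn from U[0, 0.5]
    y = np.zeros(n_steps)
    for t in range(9, n_steps - 1):
        y[t + 1] = (0.3 * y[t]
                    + 0.05 * y[t] * np.sum(y[t - 9:t + 1])
                    + 1.5 * u[t - 9] * u[t]
                    + 0.1)
    return u, y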
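The cycle-with-jumps topology from the matrix generation entry above is also easy to sketch by hand. The following is a minimal sketch under our reading of Rodan and Tino (2012), a unidirectional ring plus bidirectional jump connections at fixed intervals; the function name and default weights are ours, not EchoTorch's API:

import numpy as np

def cycle_with_jumps(n, cycle_weight=0.9, jump_weight=0.5, jump_size=3):
    w = np.zeros((n, n))
    # Unidirectional ring: unit i feeds unit i+1 with a fixed weight
    for i in range(n):
        w[(i + 1) % n, i] = cycle_weight
    # Bidirectional jump connections every jump_size units around the ring
    for i in range(0, n, jump_size):
        j = (i + jump_size) % n
        w[j, i] = jump_weight
        w[i, j] = jump_weight
    return w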
In addition to these examples, here are some Jupyter tutorials to learn how Reservoir Computing works.
- Timeseries prediction
- NARMA10 : Train an ESN to predict a timeseries based on NARMA10 (to write).
- Images classification
- MNIST classification : Classify handwritten digit images from the MNIST dataset (to write).
Here are some experiments carried out with ESNs and reproduced with EchoTorch:
- Echo State Networks-Based Reservoir Computing for MNIST Handwritten Digits Recognition
- Controlling RNNs by Conceptors (Herbert Jaeger)
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
You need the following packages to install EchoTorch:
- sphinx_bootstrap_theme
- future
- numpy
- scipy
- scikit-learn
- matplotlib
- torch>=1.3.0
- torchvision>=0.4.1
You can then install EchoTorch from PyPI:

pip install EchoTorch
- Nils Schaetti - Initial work - nschaetti
This project is licensed under the GPLv3 License - see the LICENSE file for details.
If you find EchoTorch useful for an academic publication, then please use the following BibTeX to cite it:
@misc{echotorch,
author = {Schaetti, Nils},
title = {EchoTorch: Reservoir Computing with pyTorch},
year = {2018},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/nschaetti/EchoTorch}},
}
You can simply create an ESN with the ESN or LiESN objects in the nn module:

import echotorch.nn as etnn

esn = etnn.LiESN(
    input_dim,
    n_hidden,
    output_dim,
    spectral_radius,
    learning_algo='inv',
    leaky_rate=leaky_rate
)
Where
- input_dim is the input dimensionality;
- n_hidden is the size of the reservoir;
- output_dim is the output dimensionality;
- spectral_radius is the spectral radius with a default value of 0.9;
- learning_algo allows you to choose which training algorithm to use; the possible values are inv (ridge regression solved by matrix inverse), LU and sgd;
- leaky_rate is the leaky integration rate of the reservoir units.
You now just have to give the ESN the inputs and the desired outputs:
# trainloader is assumed to be a torch.utils.data.DataLoader
# yielding (input, target) pairs
for data in trainloader:
    # Inputs and desired outputs for this batch
    inputs, targets = data
    # Give the example to EchoTorch (the ESN accumulates what it
    # needs to solve the output weights later)
    esn(inputs, targets)
# end for
After giving all examples to EchoTorch, you just have to call the finalize method.
esn.finalize()
The model is now trained and you can call the esn object to get a prediction.
predicted = esn(test_input)
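To check the quality of the prediction you can compare it with the ground truth. Here is a minimal sketch in plain PyTorch, where test_target is assumed to hold the true continuation of the series; echotorch.utils also provides ready-made measures:

import torch

# Mean squared error between prediction and ground truth
mse = torch.mean((predicted - test_target) ** 2)
# Normalised RMSE, a common metric in Reservoir Computing
nrmse = torch.sqrt(mse / torch.var(test_target))
print("Test MSE: {:.6f}, NRMSE: {:.6f}".format(mse.item(), nrmse.item()))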