This project is a correction, update, and continuation of a group project done for the Deep Learning course at ETH Zurich in Fall '21. The original project received a grade of 5.775/6.0 (by ETH Zurich standards, a 6.0 implies "good enough for submission to an international conference"); it can be found here. In this project the experiments were re-run for a slightly more general version of the ESN, new insight into the model's performance is provided, and the plots/results were updated. For more information, please see the report.
Abstract: Chaotic dynamical systems continue to puzzle and amaze practitioners: despite their finite and concise representations, they are inherently unpredictable. In spite of its simplicity, Reservoir Computing (H. Jaeger et al.) has been demonstrated to be well suited to predicting the trajectories of chaotic systems where more intricate and computationally intensive deep learning methods have failed, but it has so far only been evaluated on a small, selected set of chaotic systems (P.R. Vlachas et al.). We build a Reservoir Computing model known as the Echo State Network (ESN; H. Jaeger et al.), evaluate it on a large collection of chaotic systems recently published by W. Gilpin, and show that the ESN in fact beats all but the top approach among the 16 forecasting baselines reported by the author.
virtualenv rc --python=python3.7
source rc/bin/activate
pip install -r requirements.txt
To run with the best hyperparameters found, run:
python main.py
To get solid performance for a single hyperparameter setting, run:
python main.py --reservoir_size=1000 --radius=0.9 --sparsity=0.1 --alpha=1.0 --reg=1e-7 --seed=10
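To give a rough sense of what the flags above control, here is a minimal NumPy sketch of an Echo State Network with the same hyperparameters (reservoir size, spectral radius, sparsity, leak rate `alpha`, ridge regularization `reg`, and seed). The function names are illustrative, not the project's actual API; see `main.py` for the real implementation.

```python
import numpy as np

def make_esn(in_dim, res_size=100, radius=0.9, sparsity=0.1, seed=10):
    """Build random input and reservoir weight matrices.

    The reservoir matrix is rescaled so its spectral radius equals
    `radius`, a common heuristic for the echo state property.
    """
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, (res_size, in_dim))
    W = rng.uniform(-0.5, 0.5, (res_size, res_size))
    # Keep roughly a `sparsity` fraction of the connections.
    W[rng.random((res_size, res_size)) > sparsity] = 0.0
    W *= radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs, alpha=1.0):
    """Collect reservoir states with leaky tanh updates."""
    states = np.zeros((len(inputs), W.shape[0]))
    x = np.zeros(W.shape[0])
    for t, u in enumerate(inputs):
        x = (1 - alpha) * x + alpha * np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

def train_readout(states, targets, reg=1e-7):
    """Ridge-regression readout: only the output weights are trained."""
    return targets.T @ states @ np.linalg.inv(
        states.T @ states + reg * np.eye(states.shape[1]))
```

Only the linear readout is trained (a single least-squares solve), which is why the ESN is so much cheaper than backpropagation-based sequence models; the reservoir weights stay fixed after initialization.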