This library features:
- A Neural Turing Machine layer `NTMLayer`, whose components (controller, heads, memory) are fully customizable.
- Two types of controllers: a feed-forward `DenseController` and a "vanilla" recurrent `RecurrentController`.
- A dashboard to visualize the inner mechanism of the NTM.
- Generators to sample examples from algorithmic tasks.
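To give an idea of what the task generators produce, here is a minimal pure-Python sketch of a copy-task sampler. This is an illustration only, not the library's actual generator API: the function name `copy_task_sample` and its signature are hypothetical.

```python
import random

def copy_task_sample(length, width, rng=random):
    """Illustrative copy-task sampler (hypothetical API).

    The input is `length` random binary vectors of size `width`,
    followed by a delimiter row; the target is the same sequence,
    which the model must reproduce after seeing the delimiter.
    """
    sequence = [[rng.randint(0, 1) for _ in range(width)]
                for _ in range(length)]
    delimiter = [1] * width  # marks the end of the input sequence
    inputs = sequence + [delimiter]
    targets = [row[:] for row in sequence]  # target = copy of the input
    return inputs, targets

inputs, targets = copy_task_sample(length=5, width=8)
```

The library's own generators follow the same idea: each draw yields an (input, target) pair from an algorithmic task.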
To avoid any conflict with your existing Python setup, and to keep this project self-contained, it is suggested to work in a virtual environment created with `virtualenv`. To install `virtualenv`:

```
sudo pip install --upgrade virtualenv
```
Create a virtual environment called `venv`, activate it, and install the requirements given by `requirements.txt`. NTM-Lasagne requires the bleeding-edge version of Lasagne; check the Lasagne installation instructions for details. The latest version of Lasagne is included in `requirements.txt`.

```
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
pip install .
```
Here is a minimal example to define an `NTMLayer`:

```python
# Neural Turing Machine Layer
memory = Memory((128, 20), memory_init=lasagne.init.Constant(1e-6),
                learn_init=False, name='memory')
controller = DenseController(l_input, memory_shape=(128, 20),
                             num_units=100, num_reads=1,
                             nonlinearity=lasagne.nonlinearities.rectify,
                             name='controller')
heads = [
    WriteHead(controller, num_shifts=3, memory_shape=(128, 20),
              nonlinearity_key=lasagne.nonlinearities.rectify,
              nonlinearity_add=lasagne.nonlinearities.rectify,
              learn_init=False, name='write'),
    ReadHead(controller, num_shifts=3, memory_shape=(128, 20),
             nonlinearity_key=lasagne.nonlinearities.rectify,
             learn_init=False, name='read')
]
l_ntm = NTMLayer(l_input, memory=memory, controller=controller, heads=heads)
```
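The heads above address memory rows by content, following the Neural Turing Machines paper: each row is scored by cosine similarity against a key emitted by the controller, scaled by a key strength, and normalized with a softmax. The following pure-Python sketch illustrates that mechanism in isolation; it is independent of the library's internals, and the name `content_addressing` is ours.

```python
import math

def content_addressing(key, beta, memory):
    """Weight each memory row by softmax(beta * cosine(key, row))."""
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u)) + 1e-8
        nv = math.sqrt(sum(b * b for b in v)) + 1e-8
        return dot / (nu * nv)

    scores = [beta * cosine(key, row) for row in memory]
    # Numerically stable softmax over the similarity scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# Toy 3x2 memory: the key matches the first row most closely
memory = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
w = content_addressing(key=[1.0, 0.0], beta=5.0, memory=memory)
```

A larger key strength `beta` sharpens the weighting toward the best-matching row; the full NTM heads further combine this with location-based shifts (the `num_shifts` parameter above).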
For more detailed examples, check the `examples` folder. If you would like to train a Neural Turing Machine on one of these examples, simply run the corresponding script, e.g.

```
PYTHONPATH=. python examples/copy-task.py
```
This project has a few basic tests. To run them, call `py.test` from the project folder:

```
venv/bin/py.test ntm -vv
```
Graph optimization is computationally intensive. If you are encountering suspiciously long compilation times (more than a few minutes), you may need to increase the amount of memory allocated (if you run it in a virtual machine). Alternatively, turning off the swap may help for debugging.
Note: Unlucky initialisation of the parameters might lead to a diverging solution.
Alex Graves, Greg Wayne, Ivo Danihelka. *Neural Turing Machines*. [arXiv]
Please see the Contribution Guidelines.