Theano implementation of Learning to Remember Rare Events

wohahaKawai/Theano-LtRRE


Theano implementation of Learning to Remember Rare Events by Kaiser et al. (https://arxiv.org/abs/1703.03129)

The memory module is a standalone component that attaches to a neural network, serving as a medium for the network to store and retrieve external information.

One analogy is to think of the module as a differentiable dictionary whose keys are learned by the neural network. Each key represents high-level features of an input and may be shared across many different inputs, so long as those features are similar. The values bound to these keys represent the class labels of the corresponding samples.
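The dictionary analogy above can be sketched in plain NumPy. This is an illustrative toy, not the Theano implementation: the function name, the cosine-similarity choice, and the toy memory contents are all assumptions made for the example.

```python
import numpy as np

def memory_lookup(query, keys, values, k=3):
    """Illustrative nearest-neighbour lookup in an external memory.

    `keys` stand in for learned feature vectors, `values` for the
    class labels bound to them.
    """
    # Cosine similarity between the normalised query and every key.
    q = query / np.linalg.norm(query)
    K = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    sims = K @ q
    # Indices of the k most similar keys (descending similarity).
    nearest = np.argsort(-sims)[:k]
    # The top-1 neighbour's value is the predicted label.
    return values[nearest[0]], nearest

# Toy memory: three keys bound to labels 0, 1 and 2.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
values = np.array([0, 1, 2])
label, nn = memory_lookup(np.array([0.9, 0.1]), keys, values, k=2)
# The query points mostly along the first axis, so it retrieves label 0.
```

In the paper, similar inputs share keys, so a single memory slot can answer queries for many samples of the same class.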

Dependencies:

To run the example:

python mnist.py

You should see something like:

Starting training...
Epoch 1 of 2 took 56.404s
  training loss:                0.042020
  validation loss:              0.024000
  validation accuracy:          96.88 %
Epoch 2 of 2 took 55.051s
  training loss:                0.022224
  validation loss:              0.018059
  validation accuracy:          97.65 %

Known limitations:

Because the k-nearest neighbours are found with T.argsort, the entire memory is sorted on every lookup, which is inefficient for larger memory sizes (e.g. 10k slots). Some speedup could likely be achieved either via a Theano wrapper around numpy.argpartition, or, during training, by computing the neighbours on the CPU and passing them to the training function.
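As a rough illustration of why argpartition helps: it only needs to separate the k best scores from the rest (linear time), rather than fully ordering the whole memory. The array size, k, and variable names below are made up for the example.

```python
import numpy as np

# One similarity score per memory slot (synthetic data).
similarities = np.random.default_rng(0).random(10_000)
k = 256

# Full sort, analogous to T.argsort: O(n log n) over the whole memory.
top_k_sorted = np.argsort(-similarities)[:k]

# argpartition only guarantees the k largest come first: O(n).
part = np.argpartition(-similarities, k)[:k]
# Sort just those k entries if their relative order matters.
top_k_fast = part[np.argsort(-similarities[part])]

# Both approaches select the same neighbours.
same = np.array_equal(top_k_sorted, top_k_fast)
```

The CPU route mentioned above would follow the same pattern: compute `top_k_fast` in NumPy outside the graph and feed the indices into the training function as an input.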
