Pytorch-Relational-Recurrent-Neural-networks

Adam Santoro, Ryan Faulkner, David Raposo, Jack Rae, Mike Chrzanowski, Theophane Weber, Daan Wierstra, Oriol Vinyals, Razvan Pascanu, Timothy Lillicrap, "Relational recurrent neural networks," arXiv preprint arXiv:1806.01822 (2018).

Meta overview

This repository provides a PyTorch implementation of Relational recurrent neural networks.
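As a rough illustration of the mechanism, below is a minimal, simplified sketch of a relational memory core in PyTorch: a fixed set of memory slots is updated via multi-head self-attention over the slots plus the current input, followed by a gated update. The class name, hyper-parameters, and gating details here are illustrative assumptions, not the repository's actual code (see the `models` directory for that).

```python
import torch
import torch.nn as nn


class RelationalMemoryCore(nn.Module):
    """Simplified relational memory core (hypothetical hyper-parameters):
    memory slots interact through multi-head self-attention, then the
    memory is updated with LSTM-style input/forget gates."""

    def __init__(self, num_slots=4, slot_size=32, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(slot_size, num_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(slot_size, slot_size), nn.ReLU(),
            nn.Linear(slot_size, slot_size))
        self.norm1 = nn.LayerNorm(slot_size)
        self.norm2 = nn.LayerNorm(slot_size)
        # gates computed from the previous memory and the current input
        self.gate_proj = nn.Linear(2 * slot_size, 2 * slot_size)

    def forward(self, x, memory):
        # x: (batch, slot_size) input at this timestep
        # memory: (batch, num_slots, slot_size)
        x_slot = x.unsqueeze(1)                        # (B, 1, S)
        # let the memory attend over itself plus the new input
        mem_plus = torch.cat([memory, x_slot], dim=1)  # (B, N+1, S)
        attended, _ = self.attn(memory, mem_plus, mem_plus)
        candidate = self.norm1(memory + attended)       # residual + norm
        candidate = self.norm2(candidate + self.mlp(candidate))
        # LSTM-style gated memory update
        gates = self.gate_proj(
            torch.cat([memory, x_slot.expand_as(memory)], dim=-1))
        i, f = gates.chunk(2, dim=-1)
        return torch.sigmoid(f) * memory + torch.sigmoid(i) * torch.tanh(candidate)


# one recurrent step over a zero-initialized memory
rmc = RelationalMemoryCore()
memory = torch.zeros(2, 4, 32)
memory = rmc(torch.randn(2, 32), memory)
print(memory.shape)  # torch.Size([2, 4, 32])
```

In the paper the memory is unrolled over a sequence by feeding each timestep's input through this core; the sketch above shows a single step only.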

Current update status

- [x] Supervised setting - language modeling
- [ ] Supervised setting - Nth farthest problem
- [x] Tensorboard logging
- [ ] Language modeling - memory-efficient softmax
- [x] Attention visualization (LSUN Church-outdoor)
- [x] Core model, self-attention blocks, and data loader

Results

TBD

Prerequisites

Python 3 with PyTorch installed; the remaining Python dependencies are listed in requirements.txt.

Usage

1. Clone the repository

```bash
$ git clone https://github.com/cheonbok94/Pytorch-Relational-Recurrent-Neural-networks.git
$ cd Pytorch-Relational-Recurrent-Neural-networks
$ pip install -r requirements.txt
```

2. Install datasets (CelebA or LSUN)

$ TBD

3. Train

(1) Train the language model

```bash
$ python train.py --vocab_file ../data/vocab-2016-09-10.txt --train_prefix='../data/1-billion-word-language-modeling-benchmark-r13output/training-monolingual.tokenized.shuffled/*' --gpu_num 0 --num_epoch 100 --gpu_accelerate --batch_size 6 --bptt 70
```

4. Test

$ TBD....

5. (Optional) Tensorboard logging
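Assuming the training script writes its Tensorboard summaries to a `./logs` directory (this path is an assumption; check `train.py` for the actual log directory), the logs can be viewed with:

```shell
$ tensorboard --logdir ./logs --port 6006
```

Then open http://localhost:6006 in a browser.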

Reference