RBP

This is the PyTorch implementation of Recurrent Back-Propagation (RBP) as described in the following ICML 2018 paper:

@article{liao2018reviving,
  title={Reviving and Improving Recurrent Back-Propagation},
  author={Liao, Renjie and Xiong, Yuwen and Fetaya, Ethan and Zhang, Lisa and Yoon, KiJung and Pitkow, Xaq and Urtasun, Raquel and Zemel, Richard},
  journal={arXiv preprint arXiv:1803.06396},
  year={2018}
}

Setup

To set up the experiments, build our customized operators by running the following script:

./setup.sh

Dependencies

Python 3, PyTorch (0.4.0)

Run Demos

  • To run experiment X, where X is one of {hopfield, cora, pubmed, hypergrad}:

    python run_exp.py -c config/X.yaml

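For orientation, a hypothetical sketch of what such a config file might contain. Only grad_method is named in this README; every other key and value below is an illustrative assumption, so check the actual files under config/ for the real schema:

```yaml
# Hypothetical sketch only -- see e.g. config/hopfield.yaml for the real keys.
exp_name: hopfield_demo      # assumed key
grad_method: rbp             # named in the Notes below; switches BPTT / TBPTT / RBP variants
```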
Notes:

  • Most hyperparameters in the configuration yaml file are self-explanatory.
  • To switch among BPTT, TBPTT, and the RBP variants, specify grad_method in the config file.
  • Conjugate-gradient-based RBP requires forward-mode auto-differentiation, which we provide only for the Hopfield network and graph neural network (GNN) experiments. See the comments in model/rbp.py for more details.
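To illustrate the core idea behind the RBP variants: at a fixed point h* of the recurrence h = F(h, x), the implicit-function-theorem gradient requires the vector (I - J)^{-T} dL/dh*, where J is the Jacobian dF/dh at h*. The Neumann-series variant approximates this inverse with a truncated series. The following is a minimal NumPy sketch of that series, not the repo's implementation; the function name and the toy Jacobian are illustrative:

```python
import numpy as np

def neumann_rbp(J, grad_h, n_steps=200):
    """Approximate (I - J)^{-T} @ grad_h with a truncated Neumann series.

    J      : Jacobian dF/dh evaluated at the fixed point h* (spectral radius < 1).
    grad_h : gradient of the loss w.r.t. h*.
    Returns sum_{k=0}^{n_steps} (J^T)^k grad_h, which converges to the exact
    solve as n_steps grows.
    """
    v = grad_h.copy()       # current term (J^T)^k grad_h
    acc = grad_h.copy()     # running sum of the series
    for _ in range(n_steps):
        v = J.T @ v
        acc += v
    return acc

# Toy contraction: a small random Jacobian scaled to keep its norm well below 1.
rng = np.random.default_rng(0)
J = 0.2 * rng.standard_normal((5, 5)) / np.sqrt(5)
g = rng.standard_normal(5)

approx = neumann_rbp(J, g)
exact = np.linalg.solve(np.eye(5) - J.T, g)  # direct solve for comparison
```

The conjugate-gradient variant mentioned above solves the same linear system iteratively instead of summing the series, which is why it needs forward-mode auto-differentiation for Jacobian-vector products.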

Cite

Please cite our paper if you use this code in your research work.

Questions/Bugs

Please submit a GitHub issue or contact rjliao@cs.toronto.edu if you have any questions or find any bugs.
