# eve

This repository contains code and results for the paper:

**Eve: A Gradient Based Optimization Method with Locally and Globally Adaptive Learning Rates**
Hiroaki Hayashi\*, Jayanth Koushik\*, Graham Neubig
(\* equal contribution)

## Setup

A conda environment for running the experiments can be created with:

```sh
conda env create -f environment.yml
```

The environment is activated/deactivated using:

```sh
source activate eve
source deactivate eve
```

A suitable Keras backend (e.g., TensorFlow or Theano) is required for GPU support; refer to the Keras documentation for setup instructions.

## Code

A Keras implementation of the algorithm is in `eve/optim/eve.py`. The `Eve` class defined there can be passed to Keras's `model.compile` method through the `optimizer` argument.
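For illustration, here is a minimal sketch of plugging the optimizer into a small model. It assumes the repository root is on `PYTHONPATH` so that `eve.optim.eve` is importable, and constructs `Eve()` with its defaults, since the exact constructor hyperparameters are not listed here.

```python
# Minimal sketch: compiling a Keras model with the Eve optimizer.
# Assumes the repository root is on PYTHONPATH so that
# `eve.optim.eve` is importable; Eve() is used with its default
# hyperparameters since the constructor signature isn't shown here.
from keras.models import Sequential
from keras.layers import Dense

from eve.optim.eve import Eve

model = Sequential([
    Dense(64, activation="relu", input_shape=(100,)),
    Dense(10, activation="softmax"),
])

# Pass the Eve instance via the optimizer argument, just like any
# built-in Keras optimizer.
model.compile(
    optimizer=Eve(),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```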

Scripts for the various experiments are in `eve/exp/runners`. Run them as modules from the repository root, for example:

```sh
python -m eve.exp.runners.compsched --help
```

The `--help` output describes the choices for the various arguments. Learning rate schedules and datasets are referred to by their class names (defined in `eve/exp/lrscheds.py` and `eve/exp/datasets.py` respectively), and arguments to these classes are passed on the command line as JSON strings; refer to the paper for the values used in our experiments. A hypothetical invocation is sketched below.
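The following sketch only illustrates the JSON-string convention; the flag names, the `CIFAR10` dataset, and the `ExpDecay` schedule are placeholders rather than the runner's actual interface, so check `--help` for the real argument names.

```sh
# Hypothetical invocation illustrating JSON-string arguments.
# Flag names, the dataset, and the schedule class are placeholders;
# consult `--help` for the actual interface.
python -m eve.exp.runners.compsched \
    --dataset CIFAR10 \
    --lr-schedule ExpDecay \
    --lr-schedule-args '{"decay": 0.96}'
```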

## Citation

```bibtex
@article{hayashi2017eve,
  title={Eve: A Gradient Based Optimization Method with Locally and Globally Adaptive Learning Rates},
  author={Hayashi, Hiroaki and Koushik, Jayanth and Neubig, Graham},
  journal={arXiv preprint arXiv:1611.01505},
  year={2017}
}
```
