Commit

fixing spacing
Sam Wiseman committed Jan 2, 2017
1 parent dd211d0 · commit 78e6b4c
Showing 2 changed files with 2 additions and 3 deletions.
README.md: 3 changes (1 addition, 2 deletions)
@@ -32,7 +32,6 @@ See below for more details on how to use them.

This project is maintained by [Yoon Kim](http://people.fas.harvard.edu/~yoonkim).
Feel free to post any questions/issues on the issues page.

### Dependencies

#### Python
@@ -181,7 +180,7 @@ For seq2seq I've found vanilla SGD to work well but feel free to experiment.
* `learning_rate`: Starting learning rate. For 'adagrad', 'adadelta', and 'adam', this is the global
learning rate. Recommended settings vary based on `optim`: sgd (`learning_rate = 1`), adagrad
(`learning_rate = 0.1`), adadelta (`learning_rate = 1`), adam (`learning_rate = 0.1`).
-* `layer_lrs`: Comma-separated learning rates for encoder, decoder, and generator when using 'adagrad', 'adadelta', or 'adam' for 'optim' option. Layer-specific learning rates cannot currently be used with sgd.
+* `layer_lrs`: Comma-separated learning rates for encoder, decoder, and generator when using 'adagrad', 'adadelta', or 'adam' for 'optim' option. Layer-specific learning rates cannot currently be used with sgd.
* `max_grad_norm`: If the norm of the gradient vector exceeds this, renormalize to have its norm equal to `max_grad_norm`.
* `dropout`: Dropout probability. Dropout is applied between vertical LSTM stacks.
* `lr_decay`: Decay learning rate by this much if (i) perplexity does not decrease on the validation
train.lua: 2 changes (1 addition, 1 deletion)
@@ -946,7 +946,7 @@ function main()
-- parse input params
opt = cmd:parse(arg)

-torch.manualSeed(opt.seed);
+torch.manualSeed(opt.seed)

if opt.gpuid >= 0 then
print('using CUDA on GPU ' .. opt.gpuid .. '...')
