Remove uniform initialisation
Instead of uniformly initialising all parameters, don't specify any
initialisation and keep the default module-specific initialisations
provided by PyTorch.

TODO: implement more sophisticated initialisation methods.
dieuwkehupkes committed Nov 27, 2018
1 parent 17cb184 commit e7313c4
Showing 1 changed file with 0 additions and 3 deletions.
train_model.py

@@ -221,9 +221,6 @@ def initialize_model(opt, src, tgt, train):
     seq2seq = Seq2seq(encoder, decoder)
     seq2seq.to(device)

-    for param in seq2seq.parameters():
-        param.data.uniform_(-0.08, 0.08)
-
     return seq2seq, input_vocab, output_vocab
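
As a hedged illustration of the TODO above (not code from this commit): a "more sophisticated" module-specific scheme could replace the deleted loop by initialising only multi-dimensional weight matrices and leaving everything else at PyTorch's defaults. The helper name init_weights and the choice of Xavier-uniform are assumptions made for this sketch.

import torch.nn as nn

def init_weights(model):
    # Hypothetical sketch: Xavier-uniform for weight matrices,
    # zeros for biases, PyTorch defaults for everything else.
    for name, param in model.named_parameters():
        if param.dim() >= 2 and 'weight' in name:
            nn.init.xavier_uniform_(param)
        elif 'bias' in name:
            nn.init.constant_(param, 0.0)

One would call init_weights(seq2seq) before the return, in place of the removed uniform_(-0.08, 0.08) loop.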


