Fix hidden_layer size for one-directional decoder (#99)
* Fix hidden_layer size for one-directional decoder

The hidden layer size of the decoder was given as `hidden_size * 2 if bidirectional else 1`, which caused a dimensionality error for non-bidirectional decoders. Changed the `1` to `hidden_size`.
dieuwkehupkes authored and kylegao91 committed Nov 20, 2017
1 parent bd3537e commit 626842c
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion examples/sample.py
@@ -100,7 +100,7 @@ def len_filter(example):
     bidirectional = True
     encoder = EncoderRNN(len(src.vocab), max_len, hidden_size,
                          bidirectional=bidirectional, variable_lengths=True)
-    decoder = DecoderRNN(len(tgt.vocab), max_len, hidden_size * 2 if bidirectional else 1,
+    decoder = DecoderRNN(len(tgt.vocab), max_len, hidden_size * 2 if bidirectional else hidden_size,
                          dropout_p=0.2, use_attention=True, bidirectional=bidirectional,
                          eos_id=tgt.eos_id, sos_id=tgt.sos_id)
     seq2seq = Seq2seq(encoder, decoder)
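
For context on why the replacement is `hidden_size` rather than `1`: the decoder is seeded with the encoder's final hidden state, whose size per direction is `hidden_size` (and `hidden_size * 2` once both directions of a bidirectional encoder are concatenated), so the decoder's hidden size has to match it. Below is a minimal sketch of the unidirectional case using plain `torch.nn.GRU` instead of this repository's `EncoderRNN`/`DecoderRNN`; the sizes and variable names are illustrative assumptions, not this project's API.

import torch
import torch.nn as nn

hidden_size = 128
bidirectional = False                     # the case this commit fixes
num_directions = 2 if bidirectional else 1

encoder = nn.GRU(input_size=32, hidden_size=hidden_size,
                 batch_first=True, bidirectional=bidirectional)

# The decoder needs hidden_size * 2 when the encoder is bidirectional
# (forward and backward states are concatenated) and hidden_size otherwise;
# using 1 here is what caused the error this commit fixes.
decoder_hidden_size = hidden_size * 2 if bidirectional else hidden_size
decoder = nn.GRU(input_size=32, hidden_size=decoder_hidden_size,
                 batch_first=True)

x = torch.randn(4, 10, 32)                # (batch, seq_len, features)
_, enc_hidden = encoder(x)                # (num_directions, 4, hidden_size)

# Seeding the decoder with the encoder's final hidden state only works when
# the hidden sizes match; with decoder_hidden_size = 1 this call raises a
# RuntimeError about mismatched hidden-state dimensions.
dec_out, _ = decoder(torch.randn(4, 1, 32), enc_hidden)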
