This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

Commit

fix shape comments (#3025)
zoS3 authored and schmmd committed Jul 9, 2019
1 parent 354a19b commit 64d16ac
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion allennlp/models/encoder_decoders/simple_seq2seq.py
@@ -441,7 +441,7 @@ def _prepare_attended_input(self,
  """Apply attention over encoder outputs and decoder state."""
  # Ensure mask is also a FloatTensor. Or else the multiplication within
  # attention will complain.
- # shape: (batch_size, max_input_sequence_length, encoder_output_dim)
+ # shape: (batch_size, max_input_sequence_length)
  encoder_outputs_mask = encoder_outputs_mask.float()

  # shape: (batch_size, max_input_sequence_length)
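The fix is purely a shape comment: encoder_outputs_mask marks valid (non-padded) input timesteps, so it is 2-D, (batch_size, max_input_sequence_length), unlike the 3-D encoder_outputs it masks. Below is a minimal sketch, assuming plain PyTorch and a simple dot-product attention with multiply-and-renormalize masking rather than the actual _prepare_attended_input implementation; the variable names mirror the comments in the diff, but the code itself is illustrative only.

import torch

batch_size, max_input_sequence_length, encoder_output_dim = 2, 5, 8

# shape: (batch_size, max_input_sequence_length, encoder_output_dim)
encoder_outputs = torch.randn(batch_size, max_input_sequence_length, encoder_output_dim)
# shape: (batch_size, encoder_output_dim)
decoder_hidden = torch.randn(batch_size, encoder_output_dim)
# One entry per input timestep (1 = real token, 0 = padding), so the mask is
# 2-D: (batch_size, max_input_sequence_length), not 3-D like encoder_outputs.
encoder_outputs_mask = torch.tensor([[1, 1, 1, 0, 0],
                                     [1, 1, 1, 1, 1]]).float()

# Dot-product attention scores, shape: (batch_size, max_input_sequence_length)
scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(-1)).squeeze(-1)

# Multiply-and-renormalize masking: this multiplication is where a non-float
# mask would raise a dtype complaint, hence the .float() cast in the diff above.
weights = torch.softmax(scores, dim=-1) * encoder_outputs_mask
weights = weights / (weights.sum(dim=-1, keepdim=True) + 1e-13)

# Weighted sum of encoder outputs, shape: (batch_size, encoder_output_dim)
attended_input = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)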
