
[Embedding] Forward given padding_index param to embedding() (#2504)
manuyavuz authored and schmmd committed Jul 19, 2019
1 parent 9ed9e2c commit 88a61e1
Showing 1 changed file with 1 addition and 0 deletions.
allennlp/modules/token_embedders/embedding.py
@@ -137,6 +137,7 @@ def forward(self, inputs):  # pylint: disable=arguments-differ
         inputs = util.combine_initial_dims(inputs)
 
         embedded = embedding(inputs, self.weight,
+                             padding_idx=self.padding_index,
                              max_norm=self.max_norm,
                              norm_type=self.norm_type,
                              scale_grad_by_freq=self.scale_grad_by_freq,
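In this hunk, the padding_index that the Embedding module already stores is forwarded to PyTorch's torch.nn.functional.embedding as padding_idx, which stops gradient from accumulating on the padding row. Below is a minimal sketch of that behaviour, not part of the commit, with made-up tensors and the assumption that index 0 is the padding token:

    import torch
    from torch.nn.functional import embedding

    # Illustrative 5-token vocabulary with 3-dimensional embeddings;
    # index 0 plays the role of the padding token in this sketch.
    weight = torch.randn(5, 3, requires_grad=True)
    inputs = torch.tensor([[1, 0, 4]])

    # Passing padding_idx tells PyTorch to skip gradient accumulation
    # for that row, so the padding embedding is never updated.
    embedded = embedding(inputs, weight, padding_idx=0)
    embedded.sum().backward()
    print(weight.grad[0])  # tensor([0., 0., 0.])

Without the forwarded padding_idx, every padded position would contribute gradient to the padding row, so its vector would drift during training even though it is meant to stay inert.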
