fix a type mismatch in NAT quantization run
Summary:
Fix a type mismatch which was found after patching NAT on top of quantization.
Ning suggested this fix. It remains to be understood why the mismatch only appears after patching in the quantization diff.

Reviewed By: kahne, jhcross

Differential Revision: D18147726

fbshipit-source-id: a51becc9ad58a637a0180074eaa2b46990ab9f84
xianxl authored and facebook-github-bot committed Oct 26, 2019
1 parent c07362c commit eb68afc
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion fairseq/models/model_utils.py
@@ -108,7 +108,7 @@ def fill_tensors(x, mask, y, padding_idx: int):
         x = expand_2d_or_3d_tensor(x, y.size(1), padding_idx)
         x[mask] = y
     elif x.size(1) > y.size(1):
-        x[mask] = torch.tensor(padding_idx)
+        x[mask] = torch.tensor(padding_idx).type_as(x)
         if x.dim() == 2:
             x[mask, :y.size(1)] = y
         else:
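
For context, a minimal sketch of the mismatch: torch.tensor(padding_idx) defaults to int64 (Long), so the masked assignment mixes dtypes and can raise a type-mismatch error on PyTorch versions of this era; .type_as(x) casts the filler value to x's dtype first. The half-precision dtype below is an assumption standing in for whatever dtype the quantization patch gives x.

    import torch

    # Assumed scenario: after the quantization patch, x is no longer the
    # default float32 (half precision is used here purely for illustration).
    x = torch.zeros(4, 5, dtype=torch.half)
    mask = torch.tensor([True, False, True, False])
    padding_idx = 1

    # Before the fix: torch.tensor(padding_idx) is int64 (Long), so the
    # masked assignment mixes dtypes and can raise a type-mismatch error:
    #   x[mask] = torch.tensor(padding_idx)

    # After the fix: .type_as(x) casts the filler to x's dtype, so the
    # 0-d tensor broadcasts cleanly over the selected rows.
    x[mask] = torch.tensor(padding_idx).type_as(x)
    print(x.dtype)  # torch.float16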
