Summary:
# Before submitting
- [ ] Was this discussed/approved via a Github issue? (no need for typos, doc improvements)
- [x] Did you read the [contributor guideline](https://github.com/pytorch/fairseq/blob/master/CONTRIBUTING.md)?
- [ ] Did you make sure to update the docs?
- [x] Did you write any new necessary tests?
## What does this PR do?
Fixes facebookresearch#2022.
## PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in Github issues there's a high chance it will not be merged.
## Did you have fun?
Make sure you had fun coding 🙃
Pull Request resolved: facebookresearch#2090
Reviewed By: cndn
Differential Revision: D21385984
Pulled By: myleott
fbshipit-source-id: 1428e02e625b8625df71a83c05dcf933c3f899df
mgaido91 added a commit to mgaido91/FBK-fairseq-ST that referenced this issue on Jan 12, 2021
🐛 Bug
This is not a real bug; rather, I want to discuss a recent change in master.

Situation: My (transformer) encoder sub-class needs additional input during the forward pass. Currently I'm feeding this to my encoder by adding it to the `net_input` dictionary. During validation/generation I need the additional input as well. This worked out of the box for generation, because the generator used to forward the whole `net_input` to the encoder: https://github.com/pytorch/fairseq/blob/630701eaa750efda4f7aeb1a6d693eb5e690cab1/fairseq/sequence_generator.py#L133

With a recent change to the `SequenceGenerator`, this is no longer possible. The generator now only feeds `src_tokens` and `src_lengths` to the encoder (probably for TorchScript compatibility): https://github.com/pytorch/fairseq/blob/57526c63433c0b1c997fc91c0881867532567266/fairseq/sequence_generator.py#L197-L200
This change requires me to sub-class the generator and add the input there as well.
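To make the difference concrete, here is a minimal sketch of the two generator behaviours in plain Python (no fairseq or torch; all names such as `MyEncoder`, `extra_feature`, `generate_old`, and `generate_new` are illustrative, not the actual fairseq API):

```python
class MyEncoder:
    """Toy encoder that needs an extra input besides src_tokens/src_lengths."""

    def forward(self, src_tokens, src_lengths, extra_feature=None):
        if extra_feature is None:
            raise ValueError("extra_feature is required but was not forwarded")
        return {"tokens": src_tokens, "extra": extra_feature}


def generate_old(encoder, net_input):
    # Old behaviour: the whole net_input dict is forwarded,
    # so custom keys reach the encoder automatically.
    return encoder.forward(**net_input)


def generate_new(encoder, net_input):
    # New behaviour: only src_tokens and src_lengths are forwarded
    # (e.g. to keep the call TorchScript-friendly); custom keys are dropped.
    return encoder.forward(
        src_tokens=net_input["src_tokens"],
        src_lengths=net_input["src_lengths"],
    )


net_input = {"src_tokens": [1, 2, 3], "src_lengths": [3], "extra_feature": [0.5]}
enc = MyEncoder()
generate_old(enc, net_input)   # works: extra_feature reaches the encoder
# generate_new(enc, net_input) # would raise ValueError: extra_feature is dropped
```

With the new calling convention, supporting an extra encoder input requires sub-classing the generator (or otherwise re-wiring how `net_input` is unpacked).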
Expected behavior
I should be able to add additional input to my encoder, without having to sub-class the generator.