MT task for espnet2 with IWSLT14 recipe #4111
Conversation
fix mt task towards collect_stats at stage 9
Looks cool! Some minor comments
Thanks @ftshijt @simpleoier. I have addressed your comments and am now rechecking the entire pipeline!
@ftshijt @simpleoier done! Thanks for the review!
Many thanks! The next step would just be fixing the CI issues.
Codecov Report
@@ Coverage Diff @@
## master #4111 +/- ##
==========================================
- Coverage 80.94% 80.18% -0.77%
==========================================
Files 435 438 +3
Lines 37651 38012 +361
==========================================
+ Hits 30477 30479 +2
- Misses 7174 7533 +359
LGTM! Many thanks!
I think it looks good to me.
I have minor comments, so please reflect them in the later PR.
Please also summarize TODOs for the further developments (adding tests, tuning the performance, adding models, etc.)
self.embed = torch.nn.Sequential(
    pos_enc_class(output_size, positional_dropout_rate)
)
if input_size == output_size:
What’s this?
Can you add a comment?
It's just to handle cases where the input and output sizes differ, by simply adding a linear projection to the output size. Will add a comment, thanks!
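The idea can be sketched as follows. This is a minimal illustration, not the actual espnet2 code: `build_embed` is a hypothetical helper, and plain dropout stands in for `pos_enc_class`.

```python
import torch


def build_embed(input_size: int, output_size: int, dropout_rate: float = 0.1):
    """Hypothetical sketch: prepend a linear projection to the embedding
    stack only when the input and output dimensions differ."""
    # Dropout is a stand-in for the positional-encoding class here.
    layers = [torch.nn.Dropout(dropout_rate)]
    if input_size != output_size:
        # Project features of a different dimension to output_size
        # before the rest of the encoder.
        layers.insert(0, torch.nn.Linear(input_size, output_size))
    return torch.nn.Sequential(*layers)


embed = build_embed(80, 256)
x = torch.randn(2, 10, 80)
assert embed(x).shape == (2, 10, 256)
```

When the sizes already match, the projection is skipped and the embedding stack is unchanged.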
@@ -15,8 +15,6 @@ def test_Encoder_forward_backward(input_layer, positionwise_layer_type):
)
if input_layer == "embed":
    x = torch.randint(0, 10, [2, 10])
elif input_layer is None:
Why did you remove this test?
There is no need for a special test for `input_layer=None`; that case is handled by the changes I made in transformer_encoder.py.
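A rough sketch of why a separate test becomes redundant: when both input types flow through the same embedding module, one parametrized test exercises both paths. This is a hypothetical simplification, not the real transformer_encoder.py.

```python
import torch


class TinyEncoderInput(torch.nn.Module):
    """Hypothetical simplification: one module covering both the
    input_layer="embed" (token IDs) and input_layer=None (features) cases."""

    def __init__(self, input_size, output_size, input_layer=None, vocab=10):
        super().__init__()
        if input_layer == "embed":
            self.embed = torch.nn.Embedding(vocab, output_size)
        elif input_layer is None:
            # Plain features: project only if the dimensions differ.
            self.embed = (
                torch.nn.Identity()
                if input_size == output_size
                else torch.nn.Linear(input_size, output_size)
            )
        else:
            raise ValueError(input_layer)

    def forward(self, x):
        return self.embed(x)


# Both cases share the same forward path, so one test covers both.
ids = torch.randint(0, 10, [2, 10])
feats = torch.randn(2, 10, 40)
assert TinyEncoderInput(40, 64, "embed")(ids).shape == (2, 10, 64)
assert TinyEncoderInput(40, 64, None)(feats).shape == (2, 10, 64)
```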
We should also add some documentation.
The roadmap may be updated here: #3988
Thanks. I've started my initial experiment on IWSLT'14 De-En.
Hi,
I have prepared the MT task for espnet2 along with an IWSLT14 recipe. The code works up to the training stage. Current TODOs are -
Thanks to @simpleoier @ftshijt @CoderPat for their help at various stages of the build.