Embed defaultlm #1623
Conversation
I don't know what's going on with:

espnet/test/test_e2e_compatibility.py Line 40 in 8db0c29
espnet/test/test_e2e_compatibility.py Line 50 in 8db0c29
espnet/test/test_e2e_compatibility.py Line 74 in 8db0c29

It seems to be a problem with downloading the pretrained model. Can you guys give me a clue?

This happens on the other PR as well.
Codecov Report

```diff
@@           Coverage Diff            @@
##           master    #1623   +/-   ##
=======================================
+ Coverage   78.08%   78.45%   +0.36%
=======================================
  Files         127      127
  Lines       11774    11991     +217
=======================================
+ Hits         9194     9407     +213
- Misses       2580     2584       +4
```

Continue to review full report at Codecov.
Now it's working (this kind of thing has happened several times...)
```diff
@@ -27,6 +27,8 @@ def add_arguments(parser):
                         help='Number of hidden layers')
     parser.add_argument('--unit', '-u', type=int, default=650,
                         help='Number of hidden units')
+    parser.add_argument('--embed-unit', type=int, default=128,
```
Currently, the default uses `args.unit`, so could you also set it like that instead of `default=128`? (Maybe we can just set `default=None`?)
Actually, I thought about it last night. I agree with changing the default to 650 to keep it the same as unit.
I think it's better to make `default=None`. Then, it just uses the same dimension as the number of hidden units, according to:

`if n_embed is None:`

Thus, the behavior stays exactly the same unless your new `--embed-unit` option is specified. If you agree, please change it, and also mention this behavior in the argument help.
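A minimal sketch of the suggested pattern, assuming a simplified `add_arguments` (the argument names follow the diff above; the fallback check mirrors the `if n_embed is None:` logic cited by the reviewer, but this is an illustration, not the actual ESPnet code):

```python
import argparse


def add_arguments(parser):
    # --unit keeps its existing default of 650 hidden units.
    parser.add_argument('--unit', '-u', type=int, default=650,
                        help='Number of hidden units')
    # Suggested change: default=None so the embedding size falls back
    # to --unit unless the user explicitly overrides it.
    parser.add_argument('--embed-unit', type=int, default=None,
                        help='Number of embedding units '
                             '(defaults to the value of --unit)')
    return parser


parser = add_arguments(argparse.ArgumentParser())
args = parser.parse_args([])

# Fallback logic equivalent to the `if n_embed is None:` check:
# use --embed-unit when given, otherwise the hidden-unit size.
n_embed = args.embed_unit if args.embed_unit is not None else args.unit
print(n_embed)  # 650 when --embed-unit is not given
```

With `--embed-unit 128` on the command line, `n_embed` becomes 128, so existing configurations keep their old behavior while the new option remains available.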
OK, I can do it, but where do you suggest I add the comment? In `parser.add_argument`'s help parameter?
yes!
Co-Authored-By: Shinji Watanabe <sw005320@gmail.com>
Thanks a lot!
Add a parameter to control the embedding unit in the default RNNLM, using the same method as the Transformer LM.