
Embed defaultlm #1623

Merged: 17 commits into espnet:master, Feb 29, 2020
Conversation

qmpzzpmq (Contributor)

Add a parameter to control the embedding unit size in the default RNNLM, using the same method as the transformer LM.
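For context, a minimal sketch of what the new option controls, assuming a toy model (illustrative only, not the merged ESPnet code; the name TinyRNNLM is hypothetical):

```python
import torch
import torch.nn as nn

class TinyRNNLM(nn.Module):
    """Toy RNNLM where the embedding width is decoupled from the hidden width."""

    def __init__(self, n_vocab, n_units=650, n_embed=128):
        super().__init__()
        self.embed = nn.Embedding(n_vocab, n_embed)              # controlled by --embed-unit
        self.rnn = nn.LSTM(n_embed, n_units, batch_first=True)   # controlled by --unit
        self.out = nn.Linear(n_units, n_vocab)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.out(h)

lm = TinyRNNLM(n_vocab=100)
logits = lm(torch.randint(0, 100, (2, 5)))  # (batch=2, seq=5) -> (2, 5, 100)
```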

@qmpzzpmq qmpzzpmq closed this Feb 27, 2020
@qmpzzpmq qmpzzpmq reopened this Feb 27, 2020
@qmpzzpmq (Contributor Author)

I don't know what's going on with

    subprocess.run(cmd, check=True)
    ...
    r'<a id="uc-download-link" [^>]* href="([^"]*)">', out)[0].replace('&amp;', '&'))
    ...
    model_path = download_zip_from_google_drive(tmpdir, download_info[1])

It seems to be a problem downloading the pretrained model.
Can you guys give me a clue?
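For reference, a hedged sketch of what the failing download step appears to do, reconstructed from the fragments above: fetch the Google Drive download page, pull the direct download href out of the HTML, and unescape it. The name extract_drive_link is hypothetical; ESPnet's actual helper here is download_zip_from_google_drive.

```python
import re

def extract_drive_link(html):
    # Extract the direct download href from Google Drive's download page
    # and unescape it, mirroring the regex in the fragment above.
    href = re.findall(r'<a id="uc-download-link" [^>]* href="([^"]*)">', html)[0]
    return href.replace('&amp;', '&')

page = '<a id="uc-download-link" class="c" href="/uc?export=download&amp;id=XYZ">Download</a>'
print(extract_drive_link(page))  # /uc?export=download&id=XYZ
```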

@sw005320 (Contributor)

This happens on the other PR as well.
I could download the zip file manually, so it might be a transient issue on the Google Drive side.
Please ignore it for now.

@codecov (codecov bot) commented Feb 27, 2020

Codecov Report

Merging #1623 into master will increase coverage by 0.36%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master    #1623      +/-   ##
==========================================
+ Coverage   78.08%   78.45%   +0.36%     
==========================================
  Files         127      127              
  Lines       11774    11991     +217     
==========================================
+ Hits         9194     9407     +213     
- Misses       2580     2584       +4
Impacted Files                                        Coverage Δ
espnet/nets/pytorch_backend/lm/default.py             90.83% <100%> (+0.36%) ⬆️
espnet/nets/pytorch_backend/e2e_asr_transformer.py    94.79% <0%> (-0.17%) ⬇️
espnet/nets/pytorch_backend/e2e_st_transformer.py     85.5% <0%> (+0.76%) ⬆️
espnet/nets/pytorch_backend/rnn/decoders.py           95.04% <0%> (+1.42%) ⬆️
espnet/nets/pytorch_backend/e2e_mt_transformer.py     78.34% <0%> (+1.69%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 8db0c29...097b51d.

@sw005320 (Contributor)

Now it's working (this kind of thing has happened several times...).

@sw005320 sw005320 added this to the v.0.6.3 milestone Feb 28, 2020
@@ -27,6 +27,8 @@ def add_arguments(parser):
                             help='Number of hidden layers')
         parser.add_argument('--unit', '-u', type=int, default=650,
                             help='Number of hidden units')
+        parser.add_argument('--embed-unit', type=int, default=128,
Contributor

Currently, the default uses args.unit, so could you also set it like that instead of default=128? (Maybe we can just set default=None?)

Contributor Author

Actually, I thought about it last night. I agree with changing the default to 650 to keep it the same as unit.

Contributor

I think it's better to make default=None.
Then it just uses the same dimension as the number of hidden units, according to the referenced code.
Thus, the behavior stays exactly the same unless your new --embed-unit option is specified.
If you agree, please change it, and also add a comment to that effect in the argument help.
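A minimal sketch of the default=None fallback being suggested (the argument names follow the diff above; the fallback line itself is an assumption about the eventual implementation, not the merged code):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--unit', '-u', type=int, default=650,
                    help='Number of hidden units')
parser.add_argument('--embed-unit', type=int, default=None,
                    help='Number of embedding units; if not set, use the same '
                         'number as --unit (i.e., the pre-PR behavior)')
args = parser.parse_args([])

# Fallback: identical behavior unless --embed-unit is explicitly given.
n_embed = args.embed_unit if args.embed_unit is not None else args.unit
assert n_embed == 650
```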

Contributor Author

OK, I can do it, but where do you suggest I add the comment? In parser.add_argument's help parameter?

Contributor

yes!

Co-Authored-By: Shinji Watanabe <sw005320@gmail.com>
@sw005320 (Contributor)

Thanks a lot!

@sw005320 sw005320 merged commit 9bcab95 into espnet:master Feb 29, 2020
@qmpzzpmq qmpzzpmq deleted the embed_defaultlm branch April 28, 2020 13:26