
Does the released pre-trained model include the dual generation pre-training? #23

Closed
Robin-Y-Ding opened this issue Dec 14, 2021 · 4 comments

Comments

@Robin-Y-Ding

Dear authors,

I noticed in the paper that you pre-train the T5 model with identifier-aware denoising for 100 epochs and then further pre-train with bimodal generation for 50 epochs. Does the released model include only the first 100 epochs, or the full 150 epochs?

Thanks in advance for your clarification.

@yuewang-cuhk
Contributor

Hi, the released model is pre-trained with only the identifier-aware denoising objective for 100 epochs.

@Robin-Y-Ding
Author

I see, thank you for your response. May I ask whether there is any plan to release the 150-epoch pre-trained model? I noticed there is a recent checkpoint for multi-lingual code summarization, so I was wondering whether a model for NL2code generation is forthcoming.

@yuewang-cuhk
Contributor

Yes, we are planning to release another checkpoint for NL2code generation using CodeSearchNet. Please stay tuned :)

@Robin-Y-Ding
Author

Great to know! Thanks! Closing the issue.
