I noticed in the paper that you pre-train the T5 model with identifier-aware denoising for 100 epochs and then further pre-train with bimodal generation for 50 epochs. Does the released model include only the first 100 epochs of pre-training, or the full 150 epochs?
Thanks in advance for your clarification.
I see, thank you for your response. May I ask whether there is any plan to release the 150-epoch pre-trained model? I noticed that the latest checkpoint is for multi-lingual code summarization, so I was wondering whether a model for NL2code generation is also forthcoming?