Dear authors,
Thanks for your work!
Your work is really great, since there was no any-code completion tool before. I wonder whether you would release the baseline you designed to compare with AnyCodeGen on Java (e.g., the retrained code2seq). Building and retraining it would take a lot of time and effort. Thanks.
Hi @ShangwenWang,
Thank you for your interest in our paper!
This baseline was a prototype adaptation of code2seq to the SLM setting: we took SLM's data and converted it to the code2seq format.
The head_paths from SLM are the paths that we used in code2seq.
The target sequence (the one to be predicted) in code2seq is SLM's linearized_tree or target_seq, depending on whether you wish to predict the target as a linearized tree or as a sequence of tokens.
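For concreteness, a minimal sketch of such a conversion might look like the following. It assumes SLM examples are stored one JSON object per line, with head_paths given as (left_token, path_nodes, right_token) triples and the targets given as subtoken lists; the exact field layout is an assumption, not the authors' actual script. The output line format (target first, then space-separated `left,node|node,right` contexts, subtokens joined by `|`) follows code2seq's public preprocessing convention.

```python
import json

def slm_to_code2seq(slm_path, out_path, use_linearized_tree=True):
    """Convert SLM-style JSON-lines examples into code2seq's textual format:
    one example per line, the target first, followed by path contexts."""
    with open(slm_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            example = json.loads(line)
            # Target: either the linearized tree or the token sequence,
            # with subtokens joined by '|' as code2seq expects.
            field = "linearized_tree" if use_linearized_tree else "target_seq"
            target = "|".join(example[field])
            # Each code2seq path context is 'left,node1|node2|...,right';
            # here each head_path is assumed to be a 3-element triple.
            contexts = [
                ",".join([left, "|".join(nodes), right])
                for left, nodes, right in example["head_paths"]
            ]
            fout.write(target + " " + " ".join(contexts) + "\n")
```

The resulting file can then be fed to code2seq's standard preprocessing and training pipeline in place of its own extracted path contexts.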
Dear @urialon
Thanks for your reply. I wonder whether you will provide a query link to the adaptation, just as you have done for AnyCodeGen.
Retraining the model would be time-consuming, not to mention any technical problems I might run into.