a unified and comprehensive code representation library

nchen909/HugCode

HugCode (Under Development)

A companion library to HugNLP. Developed by Nuo Chen.

Capabilities (Currently)

CodePTMs:

full finetuning

parameter-efficient learning

few-shot learning

Environment & Preparation

```bash
conda create --name cat python=3.7
conda activate cat
pip install -r requirements.txt
git clone https://github.com/nchen909/CodePrompt
cd CodePrompt/metrics/CodeBLEU/parser
bash build.sh
cd ../../../
cp metrics/CodeBLEU/parser/my-languages.so build/
# make sure git-lfs is installed, e.g. 'apt-get install git-lfs'
bash get_models.sh
```

For CUDA 11.0+:

```bash
pip install torch==1.7.0+cu110 torchvision==0.8.1+cu110 torchaudio==0.7.0 -f https://download.pytorch.org/whl/torch_stable.html
```
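The `+cu110` suffix in the wheel names must match your installed CUDA version. As a minimal sketch (not part of the repo's tooling), a CUDA version string maps to the pip wheel tag like this:

```shell
# Hypothetical helper: maps a CUDA version such as "11.0" to the
# wheel tag used above ("cu110") by stripping the dot.
cuda_suffix() {
  echo "cu$(echo "$1" | tr -d '.')"
}

cuda_suffix 11.0   # prints cu110
```

Check your CUDA version with `nvidia-smi` before picking the wheels.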

Preparing Data

The dataset comes from CodeXGLUE.

```bash
mkdir data
cd data
pip install gdown
# the URL must be quoted: an unquoted '&' sends the command to the background
gdown "https://drive.google.com/uc?export=download&id=1BBeHFlKoyanbxaqFJ6RRWlqpiokhDhY7"
unzip data.zip
rm data.zip
```
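The download-and-unpack steps above can be wrapped in one function so the archive is only deleted after a successful extraction. This is a hedged sketch, not the repo's own tooling; `fetch_data` is a hypothetical name:

```shell
# Hypothetical wrapper around the steps above. The URL is quoted so the
# query string ("&id=...") survives the shell; unzip must succeed before
# the archive is removed.
fetch_data() {
  local url="$1" out="$2"
  gdown "$url" -O "$out"
  unzip -q "$out" && rm "$out"
}

# usage: fetch_data "https://drive.google.com/uc?export=download&id=..." data.zip
```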

Preparing Local Paths

Point WORKDIR and HUGGINGFACE_LOCALS in run.sh and run_few_shot.sh to your local paths.
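One way to rewrite those variables without editing the scripts by hand is a small `sed` patch. This is a sketch under the assumption that the scripts assign them as plain `VAR=value` lines at the start of a line (GNU `sed -i`; on macOS use `sed -i ''`):

```shell
# Hypothetical helper: replaces a top-level VAR=... assignment in a file.
set_var() {
  local file="$1" var="$2" value="$3"
  sed -i "s|^${var}=.*|${var}=${value}|" "$file"
}

# usage (paths are examples):
# set_var run.sh WORKDIR /data/CodePrompt
# set_var run.sh HUGGINGFACE_LOCALS /data/huggingface_models
```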

Supported Models and Tasks (Full Finetune)

```bash
export MODEL_NAME=
export TASK=
export SUB_TASK=
# to run one task
bash run.sh $MODEL_NAME $TASK $SUB_TASK
# to run few-shot
bash run_few_shot.sh $MODEL_NAME $TASK $SUB_TASK
# to run multi-task
bash run_multi_task.sh
```
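A concrete invocation looks like the sketch below. The `launch` wrapper is hypothetical (it only echoes the command so the combination can be checked before a long training run); the model/task names come from the lists that follow:

```shell
# Hypothetical dry-run wrapper: prints the run.sh command for a given
# MODEL_NAME / TASK / SUB_TASK combination instead of executing it.
launch() {
  local model="$1" task="$2" sub="$3"
  echo "bash run.sh $model $task $sub"
  # replace echo with the real call once the combination is verified:
  # bash run.sh "$model" "$task" "$sub"
}

launch codet5 summarize python   # prints: bash run.sh codet5 summarize python
```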

MODEL_NAME can be any one of ["roberta", "codebert", "graphcodebert", "unixcoder", "t5", "codet5", "bart", "plbart"].

TASK can be any one of ['summarize', 'translate', 'refine', 'generate', 'defect', 'clone']. (generate refers to the concode task in CodeXGLUE; we do not consider the completion task.)

SUB_TASK can be one of the values below:

| Category | Dataset | Task | Sub_task (LANG) | Type | Description |
| --- | --- | --- | --- | --- | --- |
| C2C | BCB | clone | [] (java) | bi-directional encoder | code clone detection in Java data |
| C2C | Devign | defect | [] (c) | bi-directional encoder | code defect detection in C/C++ data |
| C2C | CodeTrans | translate | ['java-cs', 'cs-java'] | end2end | code-to-code translation between Java and C# |
| C2C | Bugs2Fix | refine (repair) | ['small', 'medium'] (java) | end2end | code refinement on code repair data with small/medium functions |
| C2T | CodeSN | summarize | ['java', 'python', 'javascript', 'php', 'ruby', 'go'] | end2end | code summarization on CodeSearchNet data with six PLs |
| T2C | CONCODE | generate (concode) | [] (java) | end2end | text-to-code generation on Concode data |
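The TASK-to-SUB_TASK mapping in the table can be encoded as a small lookup so a wrong combination fails before training starts. This is a hedged sketch (the function name is hypothetical; the values mirror the table above):

```shell
# Hypothetical lookup: valid SUB_TASK values for each TASK, taken from
# the table above. Tasks with "[]" take no sub-task ("none" here).
subtasks_for() {
  case "$1" in
    summarize) echo "java python javascript php ruby go" ;;
    translate) echo "java-cs cs-java" ;;
    refine)    echo "small medium" ;;
    clone|defect|generate) echo "none" ;;
    *)         echo "unknown" ;;
  esac
}

subtasks_for refine   # prints: small medium
```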

Scripts

CodePTMs full finetuning

run_full_finetuning.sh

parameter-efficient learning

run_adapter.sh

run_bitfit.sh

run_prefix_tuning.sh

few-shot learning

run_few_shot.sh

References

  1. Nuo Chen, Qiushi Sun, Jianing Wang, Xiang Li, Ming Gao: Pass-Tuning: Towards Structure-Aware Parameter-Efficient Tuning for Code Representation Learning. EMNLP 2023.
  2. Nuo Chen, Qiushi Sun, Jianing Wang, Xiang Li, Ming Gao: Evaluating and Enhancing the Robustness of Code Pre-trained Models through Structure-Aware Adversarial Samples Generation. EMNLP 2023.
  3. Nuo Chen, Qiushi Sun, Renyu Zhu, Xiang Li, Xuesong Lu, Ming Gao: CAT-probing: A Metric-based Approach to Interpret How Pre-trained Models for Programming Language Attend Code Structure. EMNLP 2022.
