# SuperCD

## Environment

```bash
conda create -n supercd python=3.9.0
conda activate supercd
bash env.sh
```

## Model

The pre-trained models are available on Hugging Face: SIR and CE.
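
The model links from the original README are not reproduced here. As a minimal sketch, one common way to fetch a Hugging Face model repository locally is via git and git-lfs; the repository IDs below are placeholders, not the actual SIR/CE locations:

```bash
# Hypothetical repo IDs -- replace with the actual SIR and CE model pages on Hugging Face.
git lfs install
git clone https://huggingface.co/<org>/<sir-model> SIR
git clone https://huggingface.co/<org>/<ce-model> CE
```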

## Active Learning

You can run:

```bash
python main.py --output_dir output_dir \
    --dataset ${dataset} \
    --plm bert-base-uncased \
    --plmpath bert-base-uncased \
    --modelname tagmodel \
    --per_device_train_batch_size 4 \
    --do_train \
    --shot 5 \
    --maxshot 5 \
    --save_strategy no \
    --num_train_epochs 10 \
    --learning_rate 1e-4 \
    --warmup_ratio 0.1 \
    --active supercd \
    --save_total_limit 1
```

The results will be written to `output_dir`. Change `--shot` to use a different number of shots; `--maxshot` is the number of additional shots for active learning.

To use a different pre-trained model, change `--plm` and `--plmpath`.

To use a different base model, change `--modelname` (tagmodel, structshot, proto, sdnet, or container).

`--num_train_epochs` is set to 50 for sdnet and 10 for the other models.

`--learning_rate` is set to 5e-5 for container and 1e-4 for the other models. Example invocations for these variants are sketched below.
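
As a sketch of how the notes above combine (same flags as the command earlier; only `--modelname`, `--num_train_epochs`, and `--learning_rate` change), not an official script:

```bash
# sdnet: same command, but trained for 50 epochs (per the note above)
python main.py --output_dir output_dir --dataset ${dataset} \
    --plm bert-base-uncased --plmpath bert-base-uncased \
    --modelname sdnet --per_device_train_batch_size 4 --do_train \
    --shot 5 --maxshot 5 --save_strategy no \
    --num_train_epochs 50 --learning_rate 1e-4 --warmup_ratio 0.1 \
    --active supercd --save_total_limit 1

# container: same command, but with learning rate 5e-5
python main.py --output_dir output_dir --dataset ${dataset} \
    --plm bert-base-uncased --plmpath bert-base-uncased \
    --modelname container --per_device_train_batch_size 4 --do_train \
    --shot 5 --maxshot 5 --save_strategy no \
    --num_train_epochs 10 --learning_rate 5e-5 --warmup_ratio 0.1 \
    --active supercd --save_total_limit 1
```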

## License

The code is released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Public License for noncommercial use only. Any commercial use requires formal permission in advance.


This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).
