This is the source code of the SCKD model, proposed in the paper "Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction". The code has been tested with the following dependencies:
- Python (tested on 3.7.4)
- CUDA (tested on 10.2)
- PyTorch (tested on 1.7.1)
- Transformers (tested on 2.11.0)
- numpy
- opt-einsum (tested on 3.3.0)
- tqdm
- sklearn
- scipy (tested on 1.5.2)
Download the pretrained language model (bert-base-uncased) from Hugging Face and put it into the ./bert-base-uncased directory.
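For example, the checkpoint can be fetched and saved to the expected directory with the transformers library (a minimal sketch, assuming the code loads a standard bert-base-uncased checkpoint together with its config and vocabulary):

from transformers import BertModel, BertTokenizer

# Download bert-base-uncased from the Hugging Face hub and save the
# weights, config, and vocabulary into ./bert-base-uncased.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
tokenizer.save_pretrained("./bert-base-uncased")
model.save_pretrained("./bert-base-uncased")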
Train the SCKD model on the FewRel dataset under the 10-way-5-shot (or 10-way-10-shot) setting with the following commands:
>> python main.py --task FewRel --shot 5 # for 10-way-5-shot setting
>> python main.py --task FewRel --shot 10 # for 10-way-10-shot setting
Train the SCKD model on the TACRED dataset under the 5-way-5-shot (or 5-way-10-shot) setting with the following commands:
>> python main.py --task tacred --shot 5 # for 5-way-5-shot setting
>> python main.py --task tacred --shot 10 # for 5-way-10-shot setting
If you find this repository helpful, please cite the following paper:
@misc{wang2023serial,
  title={Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction},
  author={Xinyi Wang and Zitao Wang and Wei Hu},
  year={2023},
  eprint={2305.06616},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}