Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction, Findings of ACL 2023

SCKD

This is the source code of the SCKD model.

Requirements

  • Python (tested on 3.7.4)
  • CUDA (tested on 10.2)
  • PyTorch (tested on 1.7.1)
  • Transformers (tested on 2.11.0)
  • numpy
  • opt-einsum (tested on 3.3.0)
  • tqdm
  • scikit-learn
  • scipy (tested on 1.5.2)
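Assuming a pip-based setup, the pinned versions above can be collected into a requirements.txt such as the following (a sketch; the exact PyTorch wheel depends on your CUDA 10.2 installation, so you may need to install torch separately following the official instructions):

```
torch==1.7.1
transformers==2.11.0
numpy
opt-einsum==3.3.0
tqdm
scikit-learn
scipy==1.5.2
```

Install with `pip install -r requirements.txt`.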

Pretrained models

Download the pretrained language model bert-base-uncased from Hugging Face and put it into the ./bert-base-uncased directory.
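One way to fetch the model is to clone its Hugging Face repository with Git LFS (a sketch; you can equally download the files by hand from the huggingface.co model page):

```shell
# Requires git-lfs (https://git-lfs.com) to pull the model weights
git lfs install
git clone https://huggingface.co/bert-base-uncased ./bert-base-uncased
```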

Run

FewRel

Train the SCKD model on the FewRel dataset under the 10-way-5-shot (or 10-way-10-shot) setting with the following commands:

>> python main.py --task FewRel --shot 5  # for 10-way-5-shot setting
>> python main.py --task FewRel --shot 10 # for 10-way-10-shot setting 

TACRED

Train the SCKD model on the TACRED dataset under the 5-way-5-shot (or 5-way-10-shot) setting with the following commands:

>> python main.py --task tacred --shot 5  # for 5-way-5-shot setting
>> python main.py --task tacred --shot 10  # for 5-way-10-shot setting

Citation

If you find the repository helpful, please cite the following paper.

@misc{wang2023serial,
      title={Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction}, 
      author={Xinyi Wang and Zitao Wang and Wei Hu},
      year={2023},
      eprint={2305.06616},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
