
Efficient-NLP-multistage-training

Source code of the IJCAI 2023 paper: "Efficient NLP Model Finetuning via Multistage Data Filtering"

Main Organization of the Code

We provide the three-stage training Python scripts for the GLUE, Amazon, and AG News datasets. dataset.py handles data preprocessing.
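The scripts follow the paper's multistage data filtering idea: as finetuning proceeds, training examples the model already handles are filtered out so that later stages spend compute only on the remaining hard examples. For orientation only, below is a minimal PyTorch sketch of a single filtering stage under that reading; every name here (model, train_loader, optimizer, confidence_threshold) and the confidence-based filter rule itself are illustrative assumptions, not the authors' implementation:

import torch
import torch.nn.functional as F

def filtered_epoch(model, train_loader, optimizer, device,
                   confidence_threshold=0.95):
    # One finetuning epoch that skips the backward pass for examples the
    # model already classifies confidently. Illustrative sketch only; the
    # repo's scripts implement the paper's actual multistage filtering.
    model.train()
    kept, seen = 0, 0
    for inputs, labels in train_loader:
        inputs, labels = inputs.to(device), labels.to(device)
        logits = model(inputs)
        # Probability the model assigns to the correct class, per example.
        with torch.no_grad():
            correct_prob = F.softmax(logits, dim=-1).gather(
                1, labels.unsqueeze(1)).squeeze(1)
        # Keep only examples the model is still uncertain about.
        mask = correct_prob < confidence_threshold
        seen += labels.size(0)
        kept += int(mask.sum())
        if not mask.any():
            continue  # whole batch filtered out; skip the backward pass
        loss = F.cross_entropy(logits[mask], labels[mask])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return kept, seen

A multistage run might call filtered_epoch once per stage, tightening confidence_threshold (or swapping in a different filter) between stages so each stage trains on a progressively smaller, harder subset of the data.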

Reference

If you find the code useful, please cite the following paper:

Xu Ouyang, Shahina Mohd Azam Ansari, Felix Xiaozhu Lin, and Yangfeng Ji. Efficient NLP Model Finetuning via Multistage Data Filtering. In Proceedings of the 32nd International Joint Conference on Artificial Intelligence (IJCAI 2023).

@inproceedings{ouyang2023efficient,
  title={Efficient NLP Model Finetuning via Multistage Data Filtering},
  author={Ouyang, Xu and Ansari, Shahina Mohd Azam and Lin, Felix Xiaozhu and Ji, Yangfeng},
  booktitle={Proceedings of the 32nd International Joint Conference on Artificial Intelligence (IJCAI)},
  year={2023}
}
