# Standard Fine-tuning

Many thanks to OpenNRE.

## Preparation

All datasets need to be placed in the `benchmark` folder.
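A hypothetical layout, using the example `SciERC/10-1` split from the Running section below (the individual file names depend on the dataset and are only illustrative):

```text
benchmark/
└── SciERC/
    └── 10-1/              # one dataset split directory
        ├── train.txt      # hypothetical file names; use whatever the dataset ships with
        ├── val.txt
        ├── test.txt
        └── rel2id.json
```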

## Running

```bash
python train_supervised_bert.py --dataset=SciERC/10-1 --pretrain_path=dmis-lab/biobert-large-cased-v1.1
```
- `--dataset`: the directory of the dataset to train on
- `--pretrain_path`: the pretrained language model (PLM); defaults to `roberta-large` (see the example after this list)
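For instance, a minimal run that keeps the default encoder could look like this (a sketch; `SciERC/10-1` is the example directory from above, and `--pretrain_path` is simply omitted so that the default `roberta-large` is used):

```bash
# Fine-tune with the default PLM (roberta-large)
python train_supervised_bert.py --dataset=SciERC/10-1
```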

## Balancing

- Re-sampling: for the re-sampled datasets, refer to the README.
- Re-weighting loss: pass `--use_loss` (a sketch follows this list).
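A sketch of a re-weighted-loss run, assuming `--use_loss` is a switch that takes no argument (the dataset and PLM are the examples from the Running section):

```bash
# Standard fine-tuning plus loss re-weighting
python train_supervised_bert.py \
    --dataset=SciERC/10-1 \
    --pretrain_path=dmis-lab/biobert-large-cased-v1.1 \
    --use_loss
```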

## Data Augmentation

### Self-training

- Assign pseudo labels: pass `--labeling True`.
- Combine the pseudo-labeled data with the gold-labeled data using `self-train_combine.py`.
- Train the student model: pass `--stutrain True` (the full pipeline is sketched after this list).
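Putting the three steps together, a minimal sketch of the pipeline might look like the following (the dataset and PLM are the examples from the Running section; `self-train_combine.py` may take its own arguments, so check the script before running):

```bash
# 1. Teacher model assigns pseudo labels
python train_supervised_bert.py \
    --dataset=SciERC/10-1 \
    --pretrain_path=dmis-lab/biobert-large-cased-v1.1 \
    --labeling True

# 2. Merge the pseudo-labeled data with the gold-labeled data
python self-train_combine.py

# 3. Train the student model on the combined data
python train_supervised_bert.py \
    --dataset=SciERC/10-1 \
    --pretrain_path=dmis-lab/biobert-large-cased-v1.1 \
    --stutrain True
```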