
# XLNet baseline for DREAM dataset

Author: Chenglei Si (River Valley High School, Singapore)

Update: You may sometimes get degenerate runs whose performance is far below the expected result. This is mainly because training is unstable on smaller datasets. Try changing the random seed (and perhaps the learning rate, batch size, warmup steps, or other hyperparameters) and restarting training. If you want, I can send you a trained checkpoint; feel free to contact me by email: sichenglei1125@gmail.com
Note: You should use the dev set for hyperparameter tuning, and then use the trained model with the test file to evaluate on the test data. This is standard practice in ML.

Usage:

  1. Download the DREAM data and unzip it into this folder.
  2. If you have not installed sentencepiece, run `pip install sentencepiece`.
  3. Run `sh run.sh` to train the model.
  4. To test a trained model, run the command shown below, changing the checkpoint name to match your own saved checkpoint.
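
```bash
python test_xlnet_dream.py \
  --data_dir=data \
  --xlnet_model=xlnet-large-cased \
  --output_dir=xlnet_dream \
  --checkpoint_name=pytorch_model_3epoch_72_len256.bin \
  --max_seq_length=256 \
  --do_eval \
  --eval_batch_size=1
```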

(The hyperparameters that I used can be found in run.sh)
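
For reference, below is a hypothetical sketch of the kind of training invocation run.sh contains. The flag names are assumptions based on the test command above and on typical pytorch_transformers example scripts, and all numeric values are illustrative placeholders; the actual hyperparameters are the ones in run.sh.

```bash
# Hypothetical sketch only: flag names are assumed from the test command above
# and from typical pytorch_transformers example scripts; all values below are
# illustrative placeholders, not the exact hyperparameters from run.sh.
python run_xlnet_dream.py \
  --data_dir=data \
  --xlnet_model=xlnet-large-cased \
  --output_dir=xlnet_dream \
  --max_seq_length=256 \
  --do_train \
  --do_eval \
  --train_batch_size=4 \
  --eval_batch_size=1 \
  --learning_rate=1e-5 \
  --num_train_epochs=3 \
  --warmup_steps=120 \
  --seed=42
```

If you hit a degenerate run as described in the Update note above, changing the random seed (for example via a seed flag, if the script exposes one) or the learning rate and retraining is the suggested remedy.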

Result: 72.0 accuracy on the test set (SOTA as of July 2019 on the DREAM leaderboard)

Note: My code is built upon Hugging Face's pytorch_transformers implementation, and the original XLNet paper is Yang et al., 2019.
