Team Members:
- Tianyu Liu
- Deqiang Huang
- Guoqing Zhao

(All of us are students at USTC.)
All the data and pretrained models are included in the `data` folder and the `exp/NBFNet/CCKS` folder.

Prepare the conda environment by running the following commands (either the conda route or the pip route works):
```bash
# conda install
conda install pytorch=1.8.0 cudatoolkit=11.1 pyg -c pytorch -c pyg -c conda-forge
conda install ninja easydict pyyaml -c conda-forge

# pip install
pip install torch==1.8.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html
pip install torch-scatter==2.0.8 torch-sparse==0.6.12 torch-geometric -f https://data.pyg.org/whl/torch-1.8.0+cu111.html
pip install ninja easydict pyyaml
```
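After installation, a quick sanity check can confirm that PyTorch and PyG see the GPU. This is a minimal sketch; the expected version strings assume the CUDA 11.1 builds installed above:

```python
# Sanity check for the environment (assumes the CUDA 11.1 builds installed above).
import torch
import torch_geometric

print(torch.__version__)          # expected: 1.8.0 (+cu111 for the pip route)
print(torch.cuda.is_available())  # should be True on a machine with a CUDA 11.1 driver
print(torch_geometric.__version__)
```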
To train a model, just run this command:

```bash
python script/run.py -c config/inductive/ccks.yaml --gpus [0]
```

All the training hyper-parameters are stored in `config/inductive/ccks.yaml`. Feel free to change them to get different results.
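For reference, the sketch below shows roughly how such a YAML config can be loaded using the `pyyaml` and `easydict` packages installed above. The key names are hypothetical placeholders, not the actual contents of `ccks.yaml`, and the real codebase may preprocess the file further:

```python
# Minimal sketch of reading a training config (key names are hypothetical).
import yaml
from easydict import EasyDict

with open("config/inductive/ccks.yaml") as f:
    cfg = EasyDict(yaml.safe_load(f))

# Hyper-parameters can then be accessed as attributes, e.g.:
# cfg.optimizer.lr, cfg.train.num_epoch  (names depend on the actual file)
```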
To run inference on the test data and generate submission files, just run this command:

```bash
python script/inference.py -c config/inductive/ccks.yaml --gpus [0]
```

This will generate a `test.json` file in the experiment folder (specified in `ccks.yaml`). A `scores.pt` file will also be written to the experiment folder; it contains the predictions of that specific model.
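If you want to inspect the predictions directly, `scores.pt` can be loaded with `torch.load`. This is a sketch only: the path shown is illustrative, and the exact layout of the saved object is an assumption that may differ from what the script actually writes:

```python
# Sketch: inspect the saved prediction scores (path and layout are assumptions).
import torch

scores = torch.load("exp/NBFNet/CCKS/scores.pt", map_location="cpu")
print(type(scores))
if torch.is_tensor(scores):
    print(scores.shape)  # e.g. one score per candidate entity for each test query
```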
We perform a grid search over the hyper-parameters; each experiment has its own config file (in `config/inductive/grid_search`). We then apply a stacking ensemble over the resulting models to get the best results.
To reproduce the best results, just run:

```bash
python script/ensemble.py
```

This will generate a `test.json` file in the current directory.
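For intuition, the sketch below shows one simple way to combine per-model `scores.pt` files: a plain weighted average rather than a full stacking meta-learner. The file names and weights are hypothetical, and `script/ensemble.py` is the authoritative implementation:

```python
# Simplified sketch of combining per-model score tensors (file names and
# weights are hypothetical; the real logic lives in script/ensemble.py).
import torch

score_files = ["exp/model_a/scores.pt", "exp/model_b/scores.pt"]  # hypothetical paths
weights = [0.5, 0.5]                                              # hypothetical weights

combined = None
for path, w in zip(score_files, weights):
    s = torch.load(path, map_location="cpu")
    combined = w * s if combined is None else combined + w * s

# Rank candidates by the combined score for each test query.
ranking = combined.argsort(dim=-1, descending=True)
```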
Our code is built on the official NBFNet codebase; we thank the authors for their contribution.