Re-examining lexical and semantic attention: Dual-view graph convolutions enhanced BERT for academic paper rating
- python 3.9
- pytorch >= 1.9.0
- numpy >= 1.13.3
- scikit-learn
- transformers
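A minimal sketch for checking the minimum versions listed above before running anything. It uses only the standard library; the package names and version floors are taken from the requirements list, and `importlib.metadata` is assumed to be available (Python >= 3.8):

```python
from importlib import metadata

# Minimum versions from the requirements list above.
REQUIREMENTS = {"torch": "1.9.0", "numpy": "1.13.3"}

def version_tuple(version: str) -> tuple:
    """Parse a dotted version string like '1.9.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split(".") if part.isdigit())

def check_requirements(requirements: dict) -> dict:
    """Return {package: installed_version}, with None for missing/too-old packages."""
    results = {}
    for package, minimum in requirements.items():
        try:
            installed = metadata.version(package)
        except metadata.PackageNotFoundError:
            results[package] = None
            continue
        results[package] = installed if version_tuple(installed) >= version_tuple(minimum) else None
    return results

if __name__ == "__main__":
    for package, found in check_requirements(REQUIREMENTS).items():
        print(f"{package}: {'missing or too old' if found is None else found}")
```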
Original datasets:
- AAPR: https://github.com/lancopku/AAPR
- PeerRead: https://github.com/allenai/PeerRead
Download the preprocessed dataset from https://drive.google.com/file/d/1UWQzGYuxL53PjNY6wmcknpwhllEEWCUl/view?usp=sharing
Download the SciBERT model from https://github.com/allenai/scibert and put it in ./bert/scibert/
Download the pre-trained state_dicts from https://drive.google.com/file/d/13Inl_ChtY0LBCp9D0wjFW1yFdJPvRMep/view?usp=sharing
# SciBERT
python main.py --phase SciBERT --data_source AAPR/PeerRead --type SciBERT
# DGCBERT
python main.py --phase DGCBERT --data_source AAPR/PeerRead --type SciBERT --mode top_biaffine+softmax --k X --alpha X --top_rate X --predict_dim X
Set `--data_source` to either `AAPR` or `PeerRead`, and replace each `X` with your chosen hyperparameter value.
# SciBERT
python main.py --phase model_test --data_source AAPR/PeerRead --model SciBERT
# DGCBERT
python main.py --phase model_test --data_source AAPR/PeerRead --model DGCBERT
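The flags in the commands above could be wired up with `argparse` roughly as follows. This is a hypothetical sketch of the command-line interface only, not the repo's actual `main.py`, and the sample values passed at the bottom are illustrative placeholders, not recommended hyperparameters:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Hypothetical parser mirroring the flags shown in the commands above."""
    parser = argparse.ArgumentParser(description="DGCBERT runner (sketch)")
    parser.add_argument("--phase", choices=["SciBERT", "DGCBERT", "model_test"])
    parser.add_argument("--data_source", choices=["AAPR", "PeerRead"])
    parser.add_argument("--type", default="SciBERT")
    parser.add_argument("--model")            # used in the model_test phase
    parser.add_argument("--mode")             # e.g. top_biaffine+softmax
    parser.add_argument("--k", type=int)
    parser.add_argument("--alpha", type=float)
    parser.add_argument("--top_rate", type=float)
    parser.add_argument("--predict_dim", type=int)
    return parser

# Sample invocation with placeholder hyperparameter values.
args = build_parser().parse_args(
    ["--phase", "DGCBERT", "--data_source", "AAPR", "--type", "SciBERT",
     "--mode", "top_biaffine+softmax", "--k", "5", "--alpha", "0.5"]
)
print(args.phase, args.data_source, args.k, args.alpha)
```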
@article{xue2023re,
  title={Re-examining lexical and semantic attention: Dual-view graph convolutions enhanced BERT for academic paper rating},
  author={Xue, Zhikai and He, Guoxiu and Liu, Jiawei and Jiang, Zhuoren and Zhao, Star and Lu, Wei},
  journal={Information Processing \& Management},
  volume={60},
  number={2},
  pages={103216},
  year={2023},
  publisher={Elsevier}
}