DGCBERT

Implementation of the paper "Re-examining lexical and semantic attention: Dual-view graph convolutions enhanced BERT for academic paper rating" (Information Processing & Management, 2023).

Requirements

  • pytorch >= 1.9.0
  • numpy >= 1.13.3
  • sklearn
  • python 3.9
  • transformers
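The list above gives version floors rather than a pinned environment. The short Python check below is a sketch of our own (not part of the repository) that verifies the listed packages are importable and meet those floors; note that sklearn is installed via the scikit-learn package.

# Sketch: verify the dependencies listed above are available and meet the stated floors.
import sys

import numpy
import sklearn  # provided by the scikit-learn package
import torch
import transformers

assert sys.version_info[:2] >= (3, 9), "Python 3.9 is expected"
assert tuple(map(int, torch.__version__.split("+")[0].split(".")[:2])) >= (1, 9)
assert tuple(map(int, numpy.__version__.split(".")[:2])) >= (1, 13)
print("torch", torch.__version__, "| numpy", numpy.__version__,
      "| transformers", transformers.__version__)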

Usage

Original dataset:

Download the preprocessed dataset from https://drive.google.com/file/d/1UWQzGYuxL53PjNY6wmcknpwhllEEWCUl/view?usp=sharing

Download the SciBERT model from https://github.com/allenai/scibert and put it in ./bert/scibert/.
The pre-trained state_dicts are available at https://drive.google.com/file/d/13Inl_ChtY0LBCp9D0wjFW1yFdJPvRMep/view?usp=sharing
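Below is a minimal sketch of how the local SciBERT copy and a downloaded state_dict can be loaded, assuming ./bert/scibert/ holds a Hugging Face-compatible SciBERT (config, vocab, and weights). The state_dict file name and the model object are placeholders; the actual names are defined by this repository's code.

# Sketch: load the local SciBERT encoder and a downloaded state_dict.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./bert/scibert/")
encoder = AutoModel.from_pretrained("./bert/scibert/")

state_dict = torch.load("dgcbert_aapr.pkl", map_location="cpu")  # hypothetical file name
# model.load_state_dict(state_dict)  # restore into the DGCBERT model built in this repo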

Training

# SciBERT (choose one data source: AAPR or PeerRead)
python main.py --phase SciBERT --data_source AAPR/PeerRead --type SciBERT
# DGCBERT (replace each X with the chosen hyperparameter value)
python main.py --phase DGCBERT --data_source AAPR/PeerRead --type SciBERT --mode top_biaffine+softmax --k X --alpha X --top_rate X --predict_dim X
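The flags --k, --alpha, --top_rate, and --predict_dim each take a single value. The sketch below (not part of the repository) sweeps a few candidate values by invoking main.py with the same flags shown above; the candidate values are hypothetical, see the paper for the reported settings.

# Sketch: grid-search the DGCBERT hyperparameters via the CLI shown above.
import itertools
import subprocess

ks = [2, 3]             # hypothetical candidates for --k
alphas = [0.1, 0.5]     # hypothetical candidates for --alpha
top_rates = [0.1, 0.2]  # hypothetical candidates for --top_rate
predict_dims = [128]    # hypothetical candidates for --predict_dim

for k, alpha, top_rate, predict_dim in itertools.product(ks, alphas, top_rates, predict_dims):
    subprocess.run([
        "python", "main.py",
        "--phase", "DGCBERT",
        "--data_source", "AAPR",  # or PeerRead
        "--type", "SciBERT",
        "--mode", "top_biaffine+softmax",
        "--k", str(k),
        "--alpha", str(alpha),
        "--top_rate", str(top_rate),
        "--predict_dim", str(predict_dim),
    ], check=True)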

Testing

# SciBERT
python main.py --phase model_test --data_source AAPR/PeerRead --model SciBERT
# DGCBERT
python main.py --phase model_test --data_source AAPR/PeerRead --model DGCBERT

BibTeX

@article{xue2023re,
  title={Re-examining lexical and semantic attention: Dual-view graph convolutions enhanced BERT for academic paper rating},
  author={Xue, Zhikai and He, Guoxiu and Liu, Jiawei and Jiang, Zhuoren and Zhao, Star and Lu, Wei},
  journal={Information Processing \& Management},
  volume={60},
  number={2},
  pages={103216},
  year={2023},
  publisher={Elsevier}
}
