
DMSC_FEDA

Document Level Multi-Aspect Rating Prediction


This repository is a PyTorch implementation of the following arXiv paper:

Tian Shi, Ping Wang, Chandan K. Reddy. Deliberate Self-Attention Network with Uncertainty Estimation for Multi-Aspect Review Rating Prediction. arXiv preprint arXiv:2009.09112, 2020.

Requirements

  • Python 3.6.9
  • argparse=1.1
  • torch=1.4.0
  • sklearn=0.22.2.post1
  • numpy=1.18.2

Dataset

Some data sources:

Please download the processed dataset from here. Place it alongside DMSC_FEDA:

|--- DMSC_FEDA
|--- Data
|    |--- trip_rate
|    |--- trip_binary
|    |    |--- dev
|    |    |--- dev.bert
|    |    |--- glove_42B_300d.npy
|    |    |--- test
|    |    |--- test.bert
|    |    |--- train
|    |    |--- train.bert
|    |    |--- vocab
|    |    |--- vocab_glove_42B_300d
|--- nats_results (results, built automatically)
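
The file glove_42B_300d.npy in the tree above is presumably a plain NumPy array holding the pretrained GloVe embedding matrix (an assumption, not verified against the repository code). The snippet below shows the assumed layout with a small stand-in array, since loading the real file works the same way:

```python
import numpy as np

# Create a small stand-in embedding matrix (vocab_size x 300) to mimic
# the layout assumed for glove_42B_300d.npy (assumption: rows are vocab
# entries, columns are the 300 GloVe dimensions).
rng = np.random.default_rng(0)
dummy = rng.standard_normal((10, 300)).astype(np.float32)
np.save("glove_42B_300d_demo.npy", dummy)

# Loading works the same way for the real file under Data/trip_binary/.
embeddings = np.load("glove_42B_300d_demo.npy")
assert embeddings.shape == (10, 300)
```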

Usage

  • Training, validation, and testing: python3 run.py --task train
  • Testing only: python3 run.py --task test
  • Evaluation: python3 run.py --task evaluate
  • Keyword extraction: python3 run.py --task keywords
  • Attention weight visualization: python3 run.py --task visualization

If you want to run the baselines, you may need to uncomment the corresponding lines in run.py.
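
For reference, the --task flag used by the commands above could be parsed with argparse roughly as follows. This is a sketch, not the actual run.py code, and parse_task is a hypothetical name:

```python
import argparse

def parse_task(argv=None):
    """Parse the --task flag the way the README's commands use it (sketch)."""
    parser = argparse.ArgumentParser(description="DMSC_FEDA runner (sketch)")
    parser.add_argument(
        "--task",
        default="train",
        choices=["train", "test", "evaluate", "keywords", "visualization"],
        help="which pipeline stage to run",
    )
    return parser.parse_args(argv).task

# e.g. `python3 run.py --task evaluate` would select the evaluation branch:
task = parse_task(["--task", "evaluate"])
print(task)  # -> evaluate
```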

Baselines Implemented

Model       Brief
mtCNN       CNN + Multi-Task Learning
mtRNN       Bi-LSTM + Multi-Task Learning
mtAttn      mtRNN + Self-Attention
mtBertAttn  BERT + Multi-Task Learning + Self-Attention
mtAttnDA    mtRNN + Deliberate Self-Attention
mtAttnFE    mtAttn + Pretrained Embedding + Feature Enrichment
FEDA        mtAttnDA + Pretrained Embedding + Feature Enrichment

Use Pretrained Model

Coming Soon.

Citation

@article{shi2020deliberate,
  title={Deliberate Self-Attention Network with Uncertainty Estimation for Multi-Aspect Review Rating Prediction},
  author={Shi, Tian and Wang, Ping and Reddy, Chandan K},
  journal={arXiv preprint arXiv:2009.09112},
  year={2020}
}
