Sentiment-Analylsis-based-on-Attention-Mechanism

Introduction

  • We tried various attention models on the sentiment analysis task: Inter-Attention BiLSTM, Transformer (self-attention), Self-Attention & Inter-Attention BiLSTM, and HAN.

  • We proposed a TransformerForClassification model that relies only on the attention mechanism and contains no RNN architecture (a minimal sketch appears after this list).

  • We trained and tested our models on both English and Chinese sentiment analysis datasets.

  • We intuitively demonstrated the reasonableness and power of the attention mechanism through attention visualization.

  • We crawled our own Chinese movie-review dataset and made it public.
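The layers of TransformerForClassification are not reproduced on this page, so the following is only a minimal sketch of an attention-only classifier in PyTorch: a Transformer encoder with learned positional embeddings, mean pooling over tokens, and a linear output layer. The hyperparameters, the pooling choice, and the constructor signature are assumptions, not the repository's actual code.

```python
import torch
import torch.nn as nn

class TransformerForClassification(nn.Module):
    """Attention-only classifier: Transformer encoder, no RNN (sketch)."""

    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2,
                 num_classes=2, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos_embed = nn.Embedding(max_len, d_model)  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, tokens):                       # tokens: (batch, seq)
        pos = torch.arange(tokens.size(1), device=tokens.device)
        x = self.embed(tokens) + self.pos_embed(pos)  # add position info
        x = self.encoder(x)                           # (batch, seq, d_model)
        return self.classifier(x.mean(dim=1))         # mean-pool, classify
```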

Models implemented in this repository

  • Inter-Attention BiLSTM

Bahdanau, Dzmitry, Kyunghyun Cho, and Yoshua Bengio. "Neural machine translation by jointly learning to align and translate." arXiv preprint arXiv:1409.0473 (2014).
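Inter-attention here follows the additive (Bahdanau-style) scoring function score(q, h) = vᵀ tanh(W[q; h]) applied over BiLSTM outputs. Below is a minimal PyTorch sketch of such a layer; the choice of query (e.g., the final hidden state vs. a learned vector) and all dimensions are assumptions, not the repository's exact implementation.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, h) = v^T tanh(W[q; h])."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.W = nn.Linear(2 * hidden_dim, hidden_dim)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, states):
        # query: (batch, hidden); states: (batch, seq, hidden) from a BiLSTM
        q = query.unsqueeze(1).expand_as(states)          # broadcast over seq
        scores = self.v(torch.tanh(self.W(torch.cat([q, states], dim=-1))))
        weights = torch.softmax(scores, dim=1)            # (batch, seq, 1)
        context = (weights * states).sum(dim=1)           # weighted sum
        return context, weights                           # weights for plots
```

Returning the weights alongside the context vector is what makes the attention visualizations in the section below possible.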

  • Transformer for classification

Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems. 2017.
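The building block of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, as defined in the cited paper. A direct PyTorch rendering of that formula:

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V (Vaswani et al.)."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)    # (..., seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v, weights
```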

  • Self-Attention & Inter-Attention BiLSTM
  • Hierarchical Attention Network

Yang, Zichao, et al. "Hierarchical attention networks for document classification." Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2016.
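HAN applies the same attention pooling twice: over word encodings to build each sentence vector, then over sentence vectors to build the document vector. The sketch below shows that pooling layer with a learned context vector (u_w at the word level, u_s at the sentence level, in the paper's notation); the class name and shapes are assumptions.

```python
import torch
import torch.nn as nn

class AttentionPool(nn.Module):
    """HAN-style attention pooling with a learned context vector (Yang et al.)."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Parameter(torch.randn(hidden_dim))  # u_w or u_s

    def forward(self, states):                 # states: (batch, seq, hidden)
        u = torch.tanh(self.proj(states))      # u_it = tanh(W h_it + b)
        scores = u @ self.context              # (batch, seq)
        weights = torch.softmax(scores, dim=1).unsqueeze(-1)
        return (weights * states).sum(dim=1)   # (batch, hidden)
```

Instantiating this layer once per level (words, then sentences) yields the two-level hierarchy the paper describes.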

Model Architecture

  • Inter-Attention BiLSTM

  • Transformer

  • Self-Attention & Inter-Attention BiLSTM

  • Hierarchical Attention Network

Attention Visualization

Hierarchical Attention Visualization
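The visualizations highlight which tokens the model attends to when predicting sentiment. As an illustration only, here is a minimal matplotlib sketch that renders per-token attention weights as a one-row heatmap; the tokens and weights below are made up, not outputs of the repository's models.

```python
import matplotlib.pyplot as plt

def plot_attention(tokens, weights, path="attention.png"):
    """Render per-token attention weights as a one-row heatmap."""
    fig, ax = plt.subplots(figsize=(len(tokens) * 0.6, 1.5))
    ax.imshow([weights], cmap="Reds", aspect="auto")
    ax.set_xticks(range(len(tokens)))
    ax.set_xticklabels(tokens, rotation=45, ha="right")
    ax.set_yticks([])
    fig.tight_layout()
    fig.savefig(path)

# Hypothetical weights from one of the attention models above.
plot_attention(["the", "movie", "was", "surprisingly", "good"],
               [0.05, 0.10, 0.05, 0.30, 0.50])
```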
