The aim of this repo is to explore different approaches to text summarization.
The dataset used here is Amazon Reviews. The approaches covered so far:
- Sequence to Sequence with attention
- Sequence to Sequence with attention, with RNN weights pre-trained on WikiText-103. Weights can be found here
- AWD-LSTM (ASGD Weight-Dropped LSTM)
- Pointer-Generator Networks with Coverage
- Pointer-Generator Networks with Coverage, with RNN weights pre-trained on WikiText-103. Weights can be found here
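All of the models above rely on an attention mechanism over the encoder states. As a rough illustration (not the repo's actual code; the class name, `hidden_dim`, and tensor shapes are made up for the example), here is a minimal Bahdanau-style additive attention layer in PyTorch:

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Additive attention: score(s, h) = v^T tanh(W [s; h]),
    where s is the decoder state and h an encoder output."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.W = nn.Linear(2 * hidden_dim, hidden_dim)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hidden); enc_outputs: (batch, src_len, hidden)
        src_len = enc_outputs.size(1)
        dec_expanded = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.v(torch.tanh(
            self.W(torch.cat([dec_expanded, enc_outputs], dim=-1))))
        attn = torch.softmax(scores.squeeze(-1), dim=-1)     # (batch, src_len)
        context = torch.bmm(attn.unsqueeze(1), enc_outputs)  # (batch, 1, hidden)
        return attn, context.squeeze(1)

# Toy usage: batch of 2, source length 5, hidden size 8
layer = AdditiveAttention(hidden_dim=8)
attn, context = layer(torch.randn(2, 8), torch.randn(2, 5, 8))
```

The attention weights sum to 1 over the source positions; a pointer-generator reuses the same distribution to copy words from the source.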
This repo is still a work in progress. Stay tuned...

Tools and techniques used:
- PyTorch
- spaCy
- multiprocessing
- cyclical learning rates
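Cyclical learning rates can be set up directly with PyTorch's built-in `torch.optim.lr_scheduler.CyclicLR`. A small sketch (the model, optimizer settings, and step sizes are arbitrary illustrations, not the repo's actual configuration):

```python
import torch
from torch.optim.lr_scheduler import CyclicLR

model = torch.nn.Linear(10, 1)
# CyclicLR with cycle_momentum (the default) needs a momentum-based optimizer
opt = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
# LR climbs from base_lr to max_lr over 4 steps, then descends back
sched = CyclicLR(opt, base_lr=1e-4, max_lr=1e-2,
                 step_size_up=4, mode='triangular')

lrs = []
for _ in range(8):
    opt.step()                              # normally after loss.backward()
    lrs.append(opt.param_groups[0]['lr'])
    sched.step()                            # advance the cycle each batch
```

Unlike epoch-based schedulers, `sched.step()` is called once per batch, so the rate oscillates within each epoch.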