Abstractive text summarization generates a summary that preserves the key information of a text, in roughly the same number of words as a human-written summary. In this report, we briefly describe the abstractive summarization task and several methods used to generate concise summaries. The accompanying notebook fine-tunes the T5 transfer-learning model on the CNN/DailyMail dataset and achieves a strong ROUGE-1 (unigram) score of 44% and a ROUGE-2 (bigram) score of 14%.
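The ROUGE-N scores reported above measure n-gram overlap between a generated summary and a reference summary. As a rough illustration of the metric (a minimal recall-only sketch, not the exact implementation used in the notebook, which typically relies on a ROUGE library), ROUGE-N recall can be computed as the fraction of reference n-grams that also appear in the candidate:

```python
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams (as tuples) occurring in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_recall(candidate, reference, n):
    """ROUGE-N recall: matched reference n-grams / total reference n-grams."""
    cand = ngrams(candidate.lower().split(), n)
    ref = ngrams(reference.lower().split(), n)
    if not ref:
        return 0.0
    # Clipped overlap: each reference n-gram counts at most as often
    # as it appears in the candidate.
    overlap = sum(min(count, cand[gram]) for gram, count in ref.items())
    return overlap / sum(ref.values())

reference = "the cat sat on the mat"
candidate = "the cat is on the mat"
print(rouge_n_recall(candidate, reference, 1))  # ROUGE-1: 5/6 matched unigrams
print(rouge_n_recall(candidate, reference, 2))  # ROUGE-2: 3/5 matched bigrams
```

The example sentences are illustrative only; in practice libraries also report precision and F1, and apply stemming and other normalization.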
Dataset: https://www.kaggle.com/gowrishankarp/newspaper-text-summarization-cnn-dailymail