Learning by Semantic Similarity Makes Abstractive Summarization Better

Author's note

The initial version of the manuscript included our model design (SemSim), experimental results for our model, the BART model, and the reference summaries (covering both automatic and human evaluation metrics), and a discussion of the results. After we archived the manuscript, we found flaws in our model's implementation and design.

The final version of the manuscript keeps the remainder of the initial paper: our findings on the benchmark dataset, the BART-generated results, and the human evaluations. We excluded our model, SemSim.

Folder description

/--|datasets/
   |results/
   |README.md
   |model.jpg
  • /datasets : Our version of the pre-processed CNN/DM dataset and the pre-processing code, modified from the PGN code by See et al. following the BART instructions (issue #1391).
  • /results : Summarization results for the CNN/DM dataset and the reduced dataset (n=1000). The folder contains the generated summaries of BART and SemSim, along with the reference summaries (not tokenized); see the evaluation sketch below.
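
As a minimal sketch of how the files in /results could be scored, the snippet below compares a file of generated summaries against a file of reference summaries with the rouge_score package. The file names and the one-summary-per-line layout are assumptions for illustration, not the actual layout of this repository.

```python
# Minimal sketch: average ROUGE F1 over a set of generated summaries.
# Assumes one summary per line; the file names below are hypothetical
# placeholders, not the actual names used in /results.
from rouge_score import rouge_scorer

def read_lines(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f]

references = read_lines("results/reference.txt")    # hypothetical path
hypotheses = read_lines("results/bart_output.txt")  # hypothetical path

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)

# Accumulate F1 for each ROUGE variant, then average over the dataset.
totals = {key: 0.0 for key in ["rouge1", "rouge2", "rougeL"]}
for ref, hyp in zip(references, hypotheses):
    scores = scorer.score(ref, hyp)
    for key in totals:
        totals[key] += scores[key].fmeasure

for key, value in totals.items():
    print(f"{key}: {value / len(references):.4f}")
```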
