
A Neural Attention Model for Abstractive Sentence Summarization

TLDR; The authors apply a neural sequence-to-sequence model with an attention mechanism (soft alignment) to abstractive sentence summarization.
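The soft-alignment idea can be sketched as follows: the attention weights form a softmax distribution over input words, and the context is the weighted average of their embeddings. This is a minimal illustrative sketch, not the paper's exact parameterization; the function and variable names (`soft_attention_context`, `P`, etc.) are hypothetical.

```python
import numpy as np

def soft_attention_context(x_emb, ctx_emb, P):
    """Soft alignment: attention-weighted context over input word embeddings.

    x_emb:   (M, d) embeddings of the M input words
    ctx_emb: (d,)   embedding of the current decoder context (recent output words)
    P:       (d, d) learned alignment matrix (names here are illustrative)
    """
    scores = x_emb @ P @ ctx_emb           # (M,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax over input positions
    return weights @ x_emb                 # (d,) context vector fed to the decoder
```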

Key Points

  • Summaries generated on the sentence level, not paragraph level
  • Summaries have fixed-length output
  • Beam search decoder
  • Extractive tuning for scoring function to encourage the model to take words from the input sequence
  • Training data: headline + first-sentence pairs
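The beam search decoding with fixed-length output mentioned above can be sketched as follows. This is a generic beam search under a stated assumption: `score_fn(prefix, word)` returns a log-probability from the trained model; the names and interface are hypothetical, not the paper's code.

```python
import heapq

def beam_search(score_fn, vocab, length, beam_width=5):
    """Fixed-length beam search: at each step, extend every partial summary
    by every vocabulary word and keep only the beam_width highest-scoring."""
    beams = [(0.0, [])]                    # (cumulative log-prob, words so far)
    for _ in range(length):
        candidates = []
        for logp, prefix in beams:
            for w in vocab:
                candidates.append((logp + score_fn(prefix, w), prefix + [w]))
        beams = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
    return beams[0][1]                     # best summary of exactly `length` words
```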