Deeplearning_PaperNotes

● Machine Comprehension / Question Answering

-Memory Module

Neural Turing Machines arxiv

End-To-End Memory Networks arxiv

Ask Me Anything: Dynamic Memory Networks for Natural Language Processing arxiv

Dynamic Memory Networks for Visual and Textual Question Answering arxiv

Tracking the World State with Recurrent Entity Networks arxiv

Learning to Skim Text arxiv

-Embedding Module

A Comparative Study of Word Embeddings for Reading Comprehension arxiv

A Structured Self-attentive Sentence Embedding arxiv

-TOEFL

Towards Machine Comprehension of Spoken Content: Initial TOEFL Listening Comprehension Test by Machine arxiv

Hierarchical Attention Model for Improved Machine Comprehension of Spoken Content arxiv

-Representation of words

Exploiting Similarities among Languages for Machine Translation arxiv

Distributed Representations of Sentences and Documents arxiv

Skip-Thought Vectors arxiv

Learning Context-Specific Word/Character Embeddings [AAAI](https://aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14601/14266) [2017/2] ├ Learns a multi-sense vector for each of the different meanings of a single word, letting the machine decide when to generate a new sense

Learned in Translation: Contextualized Word Vectors arxiv [2017/8] ├ Learns word vectors by training on English-German translation (attentional seq2seq)

● Representation of Audio

Audio Word2Vec arxiv [2016/3]

Deep convolutional acoustic word embeddings using word-pair side information arxiv [2015/10]

Learning Latent Representations for Speech Generation and Transformation arxiv [2017/4]

● Text Generation

Controllable Text Generation arxiv [2017/3]

● Summary

Get To The Point: Summarization with Pointer-Generator Networks https://arxiv.org/pdf/1704.04368.pdf

Abstractive Headline Generation for Spoken Content by Attentive Recurrent Neural Networks with ASR Error Modeling https://arxiv.org/pdf/1612.08375.pdf

Order-Preserving Abstractive Summarization for Spoken Content Based on Connectionist Temporal Classification https://arxiv.org/pdf/1709.05475v2.pdf

A Deep Reinforced Model for Abstractive Summarization https://arxiv.org/abs/1705.04304

Learning to Encode Text as Human-Readable Summaries Using Generative Adversarial Networks https://openreview.net/pdf?id=r1kNDlbCb

Diversity driven Attention Model for Query-based Abstractive Summarization arxiv [2017/4] ├ Tackles redundant, repetitive words in summaries by orthogonalizing successive attention/context vectors
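The orthogonalization idea above can be sketched in a few lines: subtract from the current context vector its projection onto the previous one, so consecutive decoding steps attend to new content. This is a toy NumPy illustration of the general principle, not the paper's exact formulation (the paper also proposes an LSTM-based variant).

```python
import numpy as np

def orthogonalize(c_t, c_prev):
    # Remove the component of the current context vector c_t that lies
    # along the previous context vector c_prev (Gram-Schmidt step),
    # encouraging diversity between successive attention contexts.
    proj = (c_t @ c_prev) / (c_prev @ c_prev) * c_prev
    return c_t - proj

c_prev = np.array([1.0, 0.0, 1.0])
c_t = np.array([0.5, 1.0, 0.5])
d = orthogonalize(c_t, c_prev)
print(np.dot(d, c_prev))  # 0.0 -- the diversified context is orthogonal to the previous one
```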

● Generalized / Miscellaneous

Attention Is All You Need [arxiv](https://arxiv.org/abs/1706.03762) [2017/6] ├ No convolution or recurrence, pure attention. Uses multi-head self-attention to capture different relations among the inputs
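The core of the paper above is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V; in self-attention the queries, keys, and values all come from the same sequence. A minimal NumPy sketch (single head, no learned projections, which the full multi-head model adds):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (n_queries, n_keys) similarity scores
    return softmax(scores) @ V        # weighted sum of value vectors

# Self-attention: Q, K, V are all the same token embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))       # 5 tokens, d_model = 8
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (5, 8)
```

Multi-head attention simply runs several such attention functions in parallel on learned linear projections of Q, K, and V, then concatenates and projects the results.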
