

- Links to a curated list of awesome implementations of neural network models (TensorFlow, Torch, Theano, Keras, ...).
- Mainly question answering, machine comprehension, and sentiment analysis.
- Contributions are welcome.

## Table of Contents

- [Python](#python)
- [TensorFlow](#tensorflow)
- [Theano](#theano)
- [Keras](#keras)
- [Torch](#torch)
- [Matlab](#matlab)
- [Deep Reinforcement Learning](#deep-reinforcement-learning)
- [People](#people)

## Python

- [context2vec: Learning Generic Context Embedding with Bidirectional LSTM]()
- [Deep Unordered Composition Rivals Syntactic Methods for Text Classification (Deep Averaging Networks, ACL 2015)]()

## TensorFlow

- [Neural Turing Machine (NTM)]() (Kim's, TensorFlow)
- [Neural Turing Machine (NTM)]() (Kai Sheng Tai's, Torch)
- [Neural Turing Machine (NTM)]() (Tan's, Theano)
- [Neural Turing Machine (NTM)]() (Go)
- [Neural Turing Machine (NTM)]() (Lasagne)
- [Neural GPUs Learn Algorithms]()
- [A Neural Attention Model for Abstractive Summarization]()
- [Recurrent Convolutional Memory Network]()
- [End-To-End Memory Network]()
- [End-To-End Memory Network]()
- [Neural Variational Inference for Text Processing]() ([WikiQA Corpus]())
- [Word2Vec]()
- [CNN code for insurance QA (question-answer matching)]() ([InsuranceQA Corpus]())
- [Some experiments on MovieQA with Hsieh, Tom and Huang in AMLDS]()
- [Teaching Machines to Read and Comprehend]()
- [Convolutional Neural Networks for Sentence Classification (Kim, EMNLP 2014)]()
- [Convolutional Neural Networks for Sentence Classification (Kim, EMNLP 2014)]()
- [Separating Answers from Queries for Neural Reading Comprehension]()
- [Neural Associative Memory for Dual-Sequence Modeling]()
- [The Ubuntu Dialogue Corpus: A Large Dataset for Research in Unstructured Multi-Turn Dialogue Systems]()
- [Key-Value Memory Networks for Directly Reading Documents]()
- [A statistical natural language generator for spoken dialogue systems (SIGDIAL 2016 short paper)]()

## Theano

- [End-To-End Memory Networks, formerly known as Weakly Supervised Memory Networks]()
- [Memory Networks]()
- [Dynamic Memory Networks]()
- [Ask Me Anything: Dynamic Memory Networks for Natural Language Processing]() (Theano)
- [Memory Networks]() (Torch/Matlab)
- [Recurrent Neural Networks with External Memory for Language Understanding]()
- [Attention Sum Reader model as presented in "Text Comprehension with the Attention Sum Reader Network"]() ([CNN and Daily Mail news data QA]())
- [Character-level language models]()
- [Hierarchical Encoder-Decoder]()
- [A Recurrent Latent Variable Model for Sequential Data]()
- [A Fast Unified Model for Sentence Parsing and Understanding (Stack-augmented Parser-Interpreter Neural Network)]()
- [Semi-supervised Question Retrieval with Gated Convolutions (NAACL 2016)]()
- [Molding CNNs for text: non-linear, non-consecutive convolutions (EMNLP 2015)]()
- [Tree RNNs]()
- [A Character-Level Decoder without Explicit Segmentation for Neural Machine Translation (ACL 2016)]()
- [Charagram: Embedding Words and Sentences via Character n-grams]()
- [Towards Universal Paraphrastic Sentence Embeddings]()
- [Dependency-based Convolutional Neural Networks for Sentence Embedding]()
- [Siamese-LSTM: Siamese recurrent neural network with LSTM for evaluating semantic similarity between sentences (AAAI 2016)]()

## Keras

- [Learning text representation using recurrent convolutional neural network with highway layers]()

## Torch

- [Sequence-to-sequence model with LSTM encoder/decoders and attention]()
- [Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks]()
- [Recurrent Memory Network for Language Modeling]()
- [Bag of Tricks for Efficient Text Classification (FastText)]()
- [Bag of Tricks for Efficient Text Classification (FastText)]() (C++)
- [Character-Aware Neural Language Models (AAAI 2016)]()
- [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks (Tree-LSTM)]()
- [A Neural Attention Model for Abstractive Summarization]()
- [Text Understanding with the Attention Sum Reader Network (Kadlec et al., ACL 2016)]()
- [A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task (Chen et al., ACL 2016)]()
- [The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations (Hill et al., ICLR 2016)]()

## Matlab

- [When Are Tree Structures Necessary for Deep Learning of Representations]()

## Deep Reinforcement Learning
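Many of the entries above are memory-network variants (End-To-End Memory Networks, Key-Value Memory Networks, Dynamic Memory Networks). As a rough framework-agnostic sketch, not taken from any of the listed repositories, the single attention "hop" at the core of an End-To-End Memory Network can be written in a few lines of NumPy (all names here are illustrative):

```python
import numpy as np

def memn2n_hop(query, memory_in, memory_out):
    """One attention 'hop' of an End-To-End Memory Network.

    query:      (d,)   embedded question u
    memory_in:  (n, d) input memory embeddings m_i
    memory_out: (n, d) output memory embeddings c_i
    """
    scores = memory_in @ query          # match the query against each memory slot
    p = np.exp(scores - scores.max())
    p /= p.sum()                        # softmax attention over the n memories
    o = p @ memory_out                  # attention-weighted sum of output embeddings
    return o + query                    # updated query, fed into the next hop

# toy usage: 4 memory slots, 3-dimensional embeddings
rng = np.random.default_rng(0)
u = rng.standard_normal(3)
m = rng.standard_normal((4, 3))
c = rng.standard_normal((4, 3))
u_next = memn2n_hop(u, m, c)
```

Stacking several such hops (each with its own memory embeddings) and projecting the final query through a softmax layer yields the full answer-selection model described in the End-To-End Memory Networks paper.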

## Machine learning and deep learning tutorials, articles and other resources

## People

- [carpedm20]()