Neural Machine Translation with Attention (DyNet) (updated Feb 26, 2017; Python)
BiLSTM-CRF for sequence labeling in Dynet
DyNet implementation of stack LSTM experiments by Grefenstette et al.
Selective Encoding for Abstractive Sentence Summarization in DyNet
Deep Recurrent Generative Decoder for Abstractive Text Summarization in DyNet
Convolutional Neural Networks for Sentence Classification in DyNet
Code exemplifying neural-network solutions for classification tasks with DyNet; it also demonstrates how to implement a custom classifier compatible with scikit-learn's API.
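Making a classifier compatible with scikit-learn's API mostly means following its estimator contract: hyperparameters are set in `__init__`, `fit(X, y)` learns state (attributes suffixed with `_`) and returns `self`, and `predict(X)` returns labels. A minimal sketch of that contract, with all names illustrative (not taken from the repository) and a majority-class baseline standing in for the DyNet network:

```python
# Sketch of a scikit-learn-compatible classifier (hypothetical names;
# a real DyNet model would build and train its computation graph in fit).
from collections import Counter

class MajorityClassifier:
    def __init__(self, smoothing=0.0):
        # Hyperparameters are stored verbatim in __init__, never validated here.
        self.smoothing = smoothing

    def get_params(self, deep=True):
        # Required by sklearn utilities such as clone() and GridSearchCV.
        return {"smoothing": self.smoothing}

    def set_params(self, **params):
        for key, value in params.items():
            setattr(self, key, value)
        return self

    def fit(self, X, y):
        # Learned state gets a trailing underscore; fit returns self
        # so calls can be chained: clf = Model().fit(X, y).
        self.majority_ = Counter(y).most_common(1)[0][0]
        return self

    def predict(self, X):
        # Predict one label per input row.
        return [self.majority_ for _ in X]
```

Usage: `MajorityClassifier().fit([[0], [1], [2]], ["a", "a", "b"]).predict([[5]])` returns `["a"]`. Because the class duck-types the estimator interface, it can be dropped into sklearn pipelines and cross-validation without subclassing anything.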
A Neural Attention Model for Abstractive Sentence Summarization in DyNet
A neural conditional random field implemented in DyNet.
Source code for the paper "Morphological Inflection Generation with Hard Monotonic Attention"
A simple modification of Chris Dyer's stack LSTM parser
Code for the paper "End-to-End Reinforcement Learning for Automatic Taxonomy Induction", ACL 2018
An attentional NMT model in Dynet
PoS Tagging with Bidirectional Long Short-Term Memory Models
Source code for an ACL 2017 paper on Chinese word segmentation
Source code for an ACL 2016 paper on Chinese word segmentation
miRNA subcellular localization