Neural Machine Translation with Attention (DyNet)
Updated Feb 26, 2017 - Python
Transition-based joint syntactic dependency parser and semantic role labeler using a stack LSTM RNN architecture.
BiLSTM-CRF for sequence labeling in Dynet
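The CRF layer in a BiLSTM-CRF tagger picks the best tag sequence by Viterbi decoding over per-position emission scores (produced by the BiLSTM) plus a learned tag-to-tag transition matrix. A framework-free sketch of that decoding step, with toy hand-set scores in place of learned ones (function name and score layout are illustrative, not taken from the repository):

```python
def viterbi_decode(emissions, transitions):
    """emissions: list of {tag: score} dicts, one per position;
    transitions: {(prev_tag, tag): score}. Returns the best tag path."""
    tags = list(emissions[0])
    # Initialize with the first position's emission scores.
    score = {t: emissions[0][t] for t in tags}
    backptr = []
    for emit in emissions[1:]:
        new_score, ptr = {}, {}
        for t in tags:
            # Best previous tag leading into tag t at this position.
            best_prev = max(tags, key=lambda p: score[p] + transitions[(p, t)])
            new_score[t] = score[best_prev] + transitions[(best_prev, t)] + emit[t]
            ptr[t] = best_prev
        score, backptr = new_score, backptr + [ptr]
    # Backtrack from the highest-scoring final tag.
    best = max(tags, key=score.get)
    path = [best]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    return list(reversed(path))


emissions = [{"B": 2.0, "I": 0.1}, {"B": 0.3, "I": 1.5}]
transitions = {("B", "B"): -1.0, ("B", "I"): 1.0,
               ("I", "B"): 0.0, ("I", "I"): 0.5}
print(viterbi_decode(emissions, transitions))  # ['B', 'I']
```

In the full model the same scores feed a forward-algorithm loss at training time; only the argmax decode differs, as shown here.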
DyNet implementation of stack LSTM experiments by Grefenstette et al.
Selective Encoding for Abstractive Sentence Summarization in DyNet
Deep Recurrent Generative Decoder for Abstractive Text Summarization in DyNet
Convolutional Neural Networks for Sentence Classification in DyNet
Code that exemplifies neural network solutions for classification tasks with DyNet. It also demonstrates how to implement a custom classifier that is compatible with scikit-learn's API.
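Scikit-learn compatibility of the kind described above rests on a small, duck-typed contract rather than inheritance: store hyperparameters verbatim in `__init__`, learn state in `fit()` and return `self`, and expose `predict()` plus `get_params()`/`set_params()`. A minimal sketch of that contract (the class and its logic are hypothetical, not the repository's code):

```python
class MajorityClassifier:
    """Toy classifier predicting the most frequent label seen in fit()."""

    def __init__(self, tie_break="first"):
        # Hyperparameters are stored as-is, per the scikit-learn convention.
        self.tie_break = tie_break

    def get_params(self, deep=True):
        return {"tie_break": self.tie_break}

    def set_params(self, **params):
        for key, value in params.items():
            setattr(self, key, value)
        return self

    def fit(self, X, y):
        counts = {}
        for label in y:
            counts[label] = counts.get(label, 0) + 1
        # Learned attributes end in a trailing underscore, per convention.
        self.majority_ = max(counts, key=counts.get)
        return self  # fit() must return self so calls can be chained

    def predict(self, X):
        return [self.majority_ for _ in X]


clf = MajorityClassifier().fit([[0], [1], [2]], ["a", "b", "b"])
print(clf.predict([[5], [6]]))  # ['b', 'b']
```

Any class following this contract can be dropped into scikit-learn pipelines and cross-validation utilities, which is what makes wrapping a DyNet model this way useful.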
A Neural Attention Model for Abstractive Sentence Summarization in DyNet
A neural conditional random field implemented in DyNet.
Source code for the paper "Morphological Inflection Generation with Hard Monotonic Attention"
A Bi-LSTM-based sequence tagger, implemented in both TensorFlow and DyNet, that can combine several sources of information about each word unit (word embeddings, character-based embeddings, and morphological tags from an FST) to obtain that word's representation. See http://github.com/onurgu/joint-ner-and-md-tagger
A simple modification of Chris Dyer's stack LSTM parser.
Code for paper "End-to-End Reinforcement Learning for Automatic Taxonomy Induction", ACL 2018