Tutorial on "Practical Neural Networks for NLP: From Theory to Code" at EMNLP 2016

Practical Neural Networks for NLP

A tutorial given by Chris Dyer, Yoav Goldberg, and Graham Neubig at EMNLP 2016 in Austin. The tutorial covers the basics of neural networks for NLP and how to implement a variety of networks simply and efficiently in the DyNet toolkit.
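
As a taste of what the first part covers, here is a minimal sketch (not taken from the tutorial slides) of building a computation graph and training a tiny network with DyNet's Python API. Function names follow DyNet 2.x and may differ slightly in other versions.

```python
import dynet as dy

# Parameters live in a ParameterCollection (dy.Model() in older DyNet versions).
model = dy.ParameterCollection()
pW = model.add_parameters((8, 2))   # hidden-layer weights
pb = model.add_parameters(8)        # hidden-layer bias
pV = model.add_parameters((1, 8))   # output-layer weights
trainer = dy.SimpleSGDTrainer(model)

# Toy data: learn XOR.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

for epoch in range(200):
    for x, y in data:
        dy.renew_cg()                                   # start a fresh computation graph
        W, b, V = dy.parameter(pW), dy.parameter(pb), dy.parameter(pV)
        h = dy.tanh(W * dy.inputVector(x) + b)          # hidden layer
        y_hat = V * h                                   # 1-dimensional output
        loss = dy.squared_distance(y_hat, dy.scalarInput(y))
        loss.forward()                                  # run the graph forward
        loss.backward()                                 # backpropagate
        trainer.update()                                # apply the gradient
```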

  • Slides, part 1: Basics

    • Computation graphs and their construction
    • Neural networks in DyNet
    • Recurrent neural networks
    • Minibatching
    • Adding new differentiable functions
  • Slides, part 2: Case studies in NLP

    • Tagging with bidirectional RNNs and character-based embeddings (a rough DyNet sketch follows this list)
    • Transition-based dependency parsing
    • Structured prediction meets deep learning
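
For the tagging case study in part 2, the following is a rough sketch of a bidirectional-LSTM tagger in DyNet, assuming hypothetical vocabulary/tag sizes and inputs already mapped to integer ids; character-based embeddings are omitted for brevity.

```python
import dynet as dy

VOCAB_SIZE, NUM_TAGS, EMB_DIM, HID_DIM = 10000, 17, 64, 32   # hypothetical sizes

model = dy.ParameterCollection()
E = model.add_lookup_parameters((VOCAB_SIZE, EMB_DIM))       # word embeddings
fwd_lstm = dy.LSTMBuilder(1, EMB_DIM, HID_DIM, model)        # left-to-right LSTM
bwd_lstm = dy.LSTMBuilder(1, EMB_DIM, HID_DIM, model)        # right-to-left LSTM
pW = model.add_parameters((NUM_TAGS, 2 * HID_DIM))           # tag-scoring projection
trainer = dy.AdamTrainer(model)

def sentence_loss(word_ids, tag_ids):
    """Sum of per-token negative log-likelihoods for one sentence."""
    dy.renew_cg()
    embs = [E[w] for w in word_ids]
    fwd = fwd_lstm.initial_state().transduce(embs)
    bwd = bwd_lstm.initial_state().transduce(list(reversed(embs)))[::-1]
    W = dy.parameter(pW)
    losses = []
    for f, b, tag in zip(fwd, bwd, tag_ids):
        scores = W * dy.concatenate([f, b])                  # scores over all tags
        losses.append(dy.pickneglogsoftmax(scores, tag))
    return dy.esum(losses)

# One training step (word_ids / tag_ids are lists of integer ids):
# loss = sentence_loss(word_ids, tag_ids)
# loss.forward(); loss.backward(); trainer.update()
```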