robPTY/pureNMT

pureNMT

This is an NMT (neural machine translation) model built from scratch with only Python and PyTorch tensors (no autograd, no nn.Module). It uses an encoder-decoder architecture with LSTMs as the backbone and Bahdanau attention. All LSTM gradients are derived by hand and verified against autograd for correctness.
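A minimal sketch of the kind of gradient check described above, with a hypothetical toy function `y = tanh(W x)` standing in for the real LSTM gates: derive the gradient by hand, then compare it against what autograd computes.

```python
import torch

# Hypothetical toy check: hand-derived gradient vs. autograd for
# y = tanh(W x), L = sum(y). The repo applies the same idea to the LSTM.
torch.manual_seed(0)
W = torch.randn(3, 4, requires_grad=True)
x = torch.randn(4)

y = torch.tanh(W @ x)
loss = y.sum()
loss.backward()  # autograd reference gradient lands in W.grad

# Manual gradient: dL/dW_ij = (1 - y_i^2) * x_j, since dL/dy_i = 1
with torch.no_grad():
    manual_dW = (1 - y ** 2).unsqueeze(1) * x.unsqueeze(0)

assert torch.allclose(W.grad, manual_dW, atol=1e-6)
```

If the two disagree beyond floating-point tolerance, the hand derivation (not autograd) is almost always the culprit.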

Roadmap

RNN

Full implementation of an RNN, used as the starting point for the LSTM; the LSTM will in turn serve as the transition toward Transformers.

  • Forward pass
  • Backward pass
  • Testing (manual gradients vs. autograd, loss vs. validation set)
  • Network optimization (learning-rate decay, additional layers, variable-length sequences)
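The forward pass above can be sketched in a few lines of pure tensor ops (no nn.Module); all sizes and weight names here are hypothetical, not the repo's actual ones:

```python
import torch

# Hypothetical minimal vanilla-RNN forward pass: h_t = tanh(Wxh x_t + Whh h_{t-1} + b)
torch.manual_seed(0)
input_size, hidden_size, seq_len = 4, 5, 3
Wxh = torch.randn(hidden_size, input_size) * 0.1
Whh = torch.randn(hidden_size, hidden_size) * 0.1
bh = torch.zeros(hidden_size)

xs = [torch.randn(input_size) for _ in range(seq_len)]
h = torch.zeros(hidden_size)
hs = []  # keep every hidden state; the backward pass needs them
for x in xs:
    h = torch.tanh(Wxh @ x + Whh @ h + bh)
    hs.append(h)
```

The backward pass then walks `hs` in reverse, accumulating gradients through the `tanh` and the recurrent matmul at each step (backpropagation through time).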

LSTM

Implemented the 1997 Hochreiter & Schmidhuber LSTM to predict an n-length sequence of sunspots given an n-length input sequence. Since those results are hard to visualize, I will also implement Seq2Seq for translation between English and Spanish (which I can verify directly).

  • Forward pass
  • Backward pass
  • Testing (manual gradients vs. autograd, training loss vs. validation loss)
  • Implementation of the 2014 Seq2Seq paper (corpus size of 142,928 words)
  • Bahdanau attention to improve translation accuracy for longer sentences
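The Bahdanau (additive) attention step above can be sketched with plain tensor ops; the projection sizes and weight names (`Wa`, `Ua`, `va`) are hypothetical placeholders for whatever the repo actually uses:

```python
import torch

# Hypothetical additive-attention sketch: score each encoder state h_i
# against the previous decoder state s_{t-1}, softmax the scores, and
# build a context vector as the weighted sum of encoder states.
torch.manual_seed(0)
enc_len, enc_dim, dec_dim, attn_dim = 6, 8, 8, 5
Wa = torch.randn(attn_dim, dec_dim) * 0.1   # projects the decoder state
Ua = torch.randn(attn_dim, enc_dim) * 0.1   # projects the encoder states
va = torch.randn(attn_dim) * 0.1            # reduces scores to scalars

enc_states = torch.randn(enc_len, enc_dim)  # h_1..h_T from the encoder
s_prev = torch.randn(dec_dim)               # decoder state s_{t-1}

# e_i = va^T tanh(Wa s_{t-1} + Ua h_i);  alpha = softmax(e)
scores = torch.tanh(s_prev @ Wa.T + enc_states @ Ua.T) @ va
alpha = torch.softmax(scores, dim=0)        # attention weights, sum to 1
context = alpha @ enc_states                # context vector, shape (enc_dim,)
```

The decoder then consumes `context` alongside its own state at each step, which is what lets long sentences attend back to the relevant source words instead of squeezing everything through one fixed vector.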

References

Throughout this project I've used countless resources; the most important ones so far are listed below.

ML Concepts
