Nirvanabc/seq2seq-rnn
TODO

- Rewrite `num_words` to be different for the two languages.
- `my_keras.py` -- some helpful functions for working with Keras.
- The model currently predicts only "start start start ...", both after random word2vec initialization and after several hours of training on real word2vec embeddings with two dictionaries.
- Write a decorator to switch between the random and real types of learning.

Reading list

1. https://arxiv.org/pdf/1609.08144.pdf -- Google's seq2seq (GNMT) paper; you should read this.
2. https://www.kaggle.com/lystdo/lstm-with-word2vec-embeddings/code -- LSTM with word2vec embeddings for classification.
3. https://github.com/keras-team/keras/blob/master/examples/lstm_seq2seq.py -- LSTM seq2seq machine translation (unfortunately, at the character level).
4. https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/21_Machine_Translation.ipynb -- a good TensorFlow/Keras tutorial (one of many in that repository), including an explanation of some Keras pitfalls around sparse_categorical_crossentropy.
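The decorator from the TODO list could look something like this minimal sketch. The names `embedding_mode` and `build_embedding_matrix` are hypothetical, not from this repository; in the real project the wrapped function would load pretrained word2vec vectors.

```python
import functools
import random

# Hypothetical sketch: a decorator that switches an embedding-building
# function between "random" and "real" (pretrained word2vec) modes.
def embedding_mode(mode="random"):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(vocab_size, dim, **kwargs):
            if mode == "random":
                # Random initialization: small uniform values, no pretrained vectors.
                return [[random.uniform(-0.05, 0.05) for _ in range(dim)]
                        for _ in range(vocab_size)]
            # "real" mode: defer to the wrapped function, which would load
            # pretrained word2vec embeddings in the actual project.
            return func(vocab_size, dim, **kwargs)
        return wrapper
    return decorator

@embedding_mode(mode="random")
def build_embedding_matrix(vocab_size, dim):
    # Placeholder: the real version would read vectors from a word2vec model.
    raise NotImplementedError("would load real word2vec vectors here")

matrix = build_embedding_matrix(vocab_size=10, dim=4)
print(len(matrix), len(matrix[0]))  # 10 4
```

Switching `mode="random"` to `mode="real"` would then route calls to the wrapped loader without touching the call sites.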
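The sparse_categorical_crossentropy pitfall mentioned in item 4 of the reading list comes down to target shape: the "sparse" variant takes integer class indices rather than one-hot vectors. A minimal pure-Python sketch of the computation (not the Keras implementation):

```python
import math

def sparse_categorical_crossentropy(y_true, y_probs):
    """Mean negative log-likelihood of the true class indices.

    y_true  -- list of integer class indices (the "sparse" target format),
               not one-hot vectors as plain categorical_crossentropy expects.
    y_probs -- list of predicted probability distributions over the vocabulary.
    """
    losses = [-math.log(probs[idx]) for idx, probs in zip(y_true, y_probs)]
    return sum(losses) / len(losses)

# Two timesteps, vocabulary of 3 tokens; targets given as indices 2 and 0.
loss = sparse_categorical_crossentropy(
    y_true=[2, 0],
    y_probs=[[0.1, 0.1, 0.8], [0.7, 0.2, 0.1]],
)
print(round(loss, 4))  # 0.2899
```

Feeding one-hot targets to the sparse loss (or integer targets to the non-sparse one) is the usual mistake in seq2seq decoders, since the decoder output is a full softmax over the vocabulary at every timestep.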