
Thai2Rom on PyTorch (seq2seq no attention mechanism) #235

Merged · 7 commits into PyThaiNLP:dev · Jun 29, 2019

Conversation

c4n (Contributor) commented Jun 24, 2019

See issue: #202
This is the PyTorch version of Thai2Rom (seq2seq with no attention mechanism).
There is still room for improvement, but it is ready for you to test.
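For readers unfamiliar with the architecture being ported here: a seq2seq model without attention encodes the whole source sequence (Thai characters) into the final hidden state of an LSTM, then a decoder LSTM generates the romanized output one character at a time from that state alone. The sketch below is illustrative only, not the actual Thai2Rom code from this PR; all class names, dimensions, and the greedy decoding helper are hypothetical.

```python
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Reads the source character sequence and returns its final LSTM state."""

    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) of character indices
        embedded = self.embedding(src)
        _, (hidden, cell) = self.lstm(embedded)
        # With no attention, this final state is the only summary of the input
        return hidden, cell


class Decoder(nn.Module):
    """Generates one output character per step from the encoder state."""

    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token, hidden, cell):
        # token: (batch, 1) -- a single target character per example
        embedded = self.embedding(token)
        output, (hidden, cell) = self.lstm(embedded, (hidden, cell))
        logits = self.out(output.squeeze(1))  # (batch, vocab_size)
        return logits, hidden, cell


def greedy_decode(encoder, decoder, src, sos_idx, max_len=20):
    """Inference: feed each predicted character back in as the next input."""
    hidden, cell = encoder(src)
    token = torch.full((src.size(0), 1), sos_idx, dtype=torch.long)
    outputs = []
    for _ in range(max_len):
        logits, hidden, cell = decoder(token, hidden, cell)
        token = logits.argmax(dim=1, keepdim=True)  # greedy choice
        outputs.append(token)
    return torch.cat(outputs, dim=1)  # (batch, max_len)
```

During training one would instead feed the ground-truth target characters to the decoder (teacher forcing) and apply cross-entropy over the logits; attention mechanisms, which this PR deliberately omits, would additionally let the decoder look back at all encoder outputs rather than only the final state.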

bact and others added 2 commits May 9, 2019 17:56
- Replace keras model with pytorch encoder-decoder LSTM model (seq2seq with no attention mechanism)
pep8speaks commented Jun 24, 2019

Hello @c4n! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

Line 11:80: E501 line too long (227 > 79 characters)

Line 134:80: E501 line too long (95 > 79 characters)

Comment last updated at 2019-06-24 17:58:36 UTC

@c4n c4n changed the title Port Thai2Rom from Keras to PyTorch Thai2Rom on PyTorch (seq2seq no attention mechanism) Jun 25, 2019
@lalital lalital merged commit cb78c50 into PyThaiNLP:dev Jun 29, 2019
@wannaphong wannaphong added this to the 2.1 milestone Aug 17, 2019

5 participants