
Saving Training model and then running it on test dataset #6

Closed
spartian opened this issue Jun 4, 2019 · 4 comments

Comments

@spartian

spartian commented Jun 4, 2019

Hello,
As I understand, there has to be(very often), training set,validation set, test set. Machine learning model trains on training set and the parameters are tuned at validation set. This model is then saved and test set is then applied at that model. However going through, hatt.py, I saw the model being saved as checkpoint. But I did not find any implementation of using that model for test set. Am I right or have i missed something in the code? The model should be saved and then applied to test set. In your case, did you not use the testing set?

@AlexGidiotis
Owner

You haven't missed anything. It looks like I never uploaded the eval code (I probably forgot to). It should be straightforward to run the saved model on a test set, though.
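The missing eval step is just the standard pattern: checkpoint the model after training, then reload it and score it once on the held-out test set. In the repo this would mean loading the Keras checkpoint (e.g. with `keras.models.load_model`) and calling `model.evaluate` on the test data. As a minimal, self-contained sketch of the workflow, using a toy stdlib "model" in place of the real network (the `ThresholdModel` class and file names here are hypothetical, not from the repo):

```python
import os
import pickle
import tempfile

# Hypothetical stand-in for a trained model: a stored threshold classifier.
# The real repo saves a Keras checkpoint instead, but the workflow is the
# same: fit on the training set, tune on the validation set, save a
# checkpoint, then reload it and evaluate once on the untouched test set.

class ThresholdModel:
    def __init__(self, threshold=0.5):
        self.threshold = threshold

    def predict(self, xs):
        return [1 if x >= self.threshold else 0 for x in xs]

def accuracy(model, xs, ys):
    preds = model.predict(xs)
    return sum(p == y for p, y in zip(preds, ys)) / len(ys)

# "Train" (the threshold would be chosen on train/validation data),
# then write the checkpoint to disk.
model = ThresholdModel(threshold=0.4)
ckpt = os.path.join(tempfile.mkdtemp(), "checkpoint.pkl")
with open(ckpt, "wb") as f:
    pickle.dump(model, f)

# Later, in a separate eval script: reload the checkpoint and
# evaluate on the test set only.
with open(ckpt, "rb") as f:
    restored = pickle.load(f)

x_test = [0.1, 0.5, 0.9, 0.3]
y_test = [0, 1, 1, 0]
print(accuracy(restored, x_test, y_test))  # → 1.0
```

With the Keras checkpoint the reload-and-evaluate step would be the analogous `model = load_model(path)` followed by `model.evaluate(x_test, y_test)`.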

@spartian
Author

spartian commented Jun 7, 2019

Another thing: have you implemented the multi-layer perceptron as a single layer? The original paper says it uses an MLP to get u_it, the hidden representation of the word annotation h_it.

@AlexGidiotis
Owner

This is implemented in the attention layer, for both the word-level and sentence-level attention.
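For reference, the "MLP" in the paper is a single tanh layer: u_it = tanh(W_w h_it + b_w), followed by a softmax over the similarities with a learned context vector u_w. A minimal NumPy sketch of that word-level attention (function and variable names are illustrative, not the repo's):

```python
import numpy as np

def word_attention(h, W, b, u_w):
    """Word-level attention as described in Yang et al. (2016).

    h:   (T, d) word annotations (e.g. BiGRU outputs for T words)
    W:   (d, d) weight of the one-layer MLP
    b:   (d,)   bias of the one-layer MLP
    u_w: (d,)   learned word context vector
    Returns the sentence vector (d,) and the attention weights (T,).
    """
    u = np.tanh(h @ W + b)             # u_it = tanh(W_w h_it + b_w)
    scores = u @ u_w                   # similarity with the context vector
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()               # softmax over the T words
    s = alpha @ h                      # sentence vector: weighted sum of h_it
    return s, alpha

# Example usage with random annotations and parameters:
rng = np.random.default_rng(0)
T, d = 5, 8
h = rng.standard_normal((T, d))
W = rng.standard_normal((d, d))
b = rng.standard_normal(d)
u_w = rng.standard_normal(d)
s, alpha = word_attention(h, W, b, u_w)
```

The sentence-level attention is identical in form, with sentence vectors in place of word annotations and a sentence context vector u_s in place of u_w.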

@spartian
Author

spartian commented Jun 7, 2019

OK, I think I missed that detail in the attention context vector. I will definitely look into it.

@spartian spartian closed this as completed Jun 7, 2019