
Feeding an input event larger than size 18 #25

Closed
terryyz opened this issue Jun 24, 2020 · 2 comments

Comments


terryyz commented Jun 24, 2020

Just want to make sure: are we meant to change the original ATOMIC setting values (17, 35) to larger values in order to feed in a sentence longer than 18 words?

Or can we feed in a long sentence as-is?
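
For reference, a minimal sketch of what those fixed sizes imply, assuming 17 and 35 cap the event and effect token lengths (the names and `pad_id` below are stand-ins for illustration, not this repository's exact API):

```python
import torch

# Assumed caps from the original ATOMIC settings being asked about.
MAX_EVENT, MAX_EFFECT = 17, 35

def pad_or_truncate(token_ids, max_len, pad_id=0):
    """Right-pad a token id list to max_len, truncating anything longer."""
    clipped = token_ids[:max_len]
    return torch.LongTensor(clipped + [pad_id] * (max_len - len(clipped)))

# A 25-token event silently loses its tail under the original settings,
# which is why simply feeding a longer sentence does not work as-is.
event = pad_or_truncate(list(range(1, 26)), MAX_EVENT)
print(event.shape)  # torch.Size([17])
```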


terryyz commented Jun 27, 2020

This will help to replace the XMB initialisation:

```python
XMB = torch.cat(
    (torch.LongTensor(prefix).view(1, len(prefix)),
     torch.LongTensor([text_encoder.encoder["<{}>".format(category)]]).view(1, 1)),
    dim=1,
)
```
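
To see what this produces, here is a self-contained version with stand-in values for `prefix`, `text_encoder`, and `category` (the stand-ins are assumptions for illustration, not the repository's actual objects):

```python
import torch

# Stand-ins: a tokenized event prefix and a minimal encoder vocabulary
# containing one category token (both hypothetical).
prefix = [101, 204, 309, 417]
encoder = {"<xNeed>": 999}
category = "xNeed"

# Build XMB from the actual prefix length instead of a fixed-size buffer,
# so longer inputs are not truncated.
XMB = torch.cat(
    (torch.LongTensor(prefix).view(1, len(prefix)),
     torch.LongTensor([encoder["<{}>".format(category)]]).view(1, 1)),
    dim=1,
)
print(XMB)        # tensor([[101, 204, 309, 417, 999]])
print(XMB.shape)  # torch.Size([1, 5])
```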

atcbosselut (Owner) commented

You'd have to change the setting and retrain the model. Training a model with the original settings and then giving sequences longer than 18 words will lead to unstable behavior.
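
Roughly speaking (a hedged sketch, assuming learned positional embeddings sized to the trained context; `n_ctx` and the dimensions here are illustrative, not COMET's exact internals), positions beyond the trained range have no trained parameters, so longer inputs can't be represented without retraining:

```python
import torch
import torch.nn as nn

# Assumed trained context: 18 input tokens plus one category token.
n_ctx = 19
pos_emb = nn.Embedding(n_ctx, 768)  # learned positional embeddings

positions = torch.arange(25).unsqueeze(0)  # a 25-token input sequence
# pos_emb(positions) raises IndexError here: positions 19..24 were never
# allocated, let alone trained, so the model must be retrained with a
# larger context to handle longer sequences.
```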
