Just want to make sure: are we meant to change the original ATOMIC setting values (17 and 35) to bigger values in order to feed a sentence longer than 18 words, or can we feed a long sentence as-is?
The following will help to replace the XMB initialisation:

```python
XMB = torch.cat(
    (torch.LongTensor(prefix).view(1, len(prefix)),
     torch.LongTensor([text_encoder.encoder["<{}>".format(category)]]).view(1, 1)),
    dim=1,
)
```
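For context, here is a minimal self-contained sketch of what that initialisation does, assuming `prefix` is a list of BPE token ids for the input event and `text_encoder.encoder` is the vocabulary dict mapping tokens (including special relation tokens such as `<xNeed>`) to ids; the concrete ids and the tiny stand-in encoder below are illustrative assumptions, not values from the repo:

```python
import torch

# Stand-in for the repo's text encoder: only the vocabulary dict matters here.
class TextEncoder:
    def __init__(self):
        # Hypothetical ids; a real encoder maps the full BPE vocab plus
        # special category tokens like "<xNeed>" to integer ids.
        self.encoder = {"<xNeed>": 40500, "person": 101, "x": 102, "runs": 103}

text_encoder = TextEncoder()
prefix = [101, 102, 103]   # token ids of the (possibly long) encoded event
category = "xNeed"         # ATOMIC relation category

# Build the model input: event tokens followed by the category token.
# Resulting shape is (1, len(prefix) + 1).
XMB = torch.cat(
    (torch.LongTensor(prefix).view(1, len(prefix)),
     torch.LongTensor([text_encoder.encoder["<{}>".format(category)]]).view(1, 1)),
    dim=1,
)
print(XMB.shape)  # torch.Size([1, 4])
```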
You'd have to change the setting and retrain the model. Training a model with the original settings and then giving sequences longer than 18 words will lead to unstable behavior.
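If retraining isn't an option, one defensive workaround is to clip the encoded event to the trained length before building XMB. This is a minimal sketch of my own, not something the repo prescribes; the 17-token limit and the helper name are assumptions you should verify against your data config:

```python
MAX_EVENT_LEN = 17  # assumed trained event-length limit for ATOMIC; check your config

def truncate_prefix(prefix, max_len=MAX_EVENT_LEN):
    """Clip the encoded event to the length the model was trained on;
    inputs longer than that are what trigger the unstable behavior."""
    return prefix[:max_len]

prefix = truncate_prefix([101, 102, 103])  # token ids of the encoded event
```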