model with modified data set reader #3
How should the embedding layer be used and configured here?
Pass a score and a target per example (batched). A candidate list can contain a single entry, in which case the crosswiki strong candidates guarantee 100% accuracy, which is misleading. Is it possible to enforce that every candidate list has length at least 2? If the length is 1, pad it with @@UNKNOWN@@ (a dummy/faulty WID class) and give that entry a prior probability of zero. See the sketch below.
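A minimal sketch of that padding step, assuming each mention's candidates are stored as (WID, prior) pairs; the @@UNKNOWN@@ sentinel and the zero prior come from the note above, while the function and variable names are illustrative.

```python
UNKNOWN_WID = "@@UNKNOWN@@"

def pad_candidates(candidates, min_length=2):
    """Ensure every candidate list has at least `min_length` entries.

    `candidates` is a list of (wid, prior) pairs for one mention.
    Lists with a single (guaranteed-correct) candidate get a dummy
    @@UNKNOWN@@ entry with a zero prior, so the model still has to
    discriminate instead of getting the answer for free.
    """
    padded = list(candidates)
    while len(padded) < min_length:
        padded.append((UNKNOWN_WID, 0.0))
    return padded

# Example: a mention with only one crosswiki candidate.
print(pad_candidates([("Q42", 1.0)]))
# [('Q42', 1.0), ('@@UNKNOWN@@', 0.0)]
```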
See `log_softmax` in https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py.
softmax(score) normalizes the raw scores to values between 0 and 1 that sum to 1 over the candidates.
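A small sketch of normalizing per-mention candidate scores with `torch.nn.functional`; treating the last dimension as the candidate dimension is an assumption.

```python
import torch
import torch.nn.functional as F

# scores: (batch, num_candidates) raw model scores per candidate
scores = torch.tensor([[2.0, 0.5, -1.0],
                       [0.1, 0.1, 0.1]])

probs = F.softmax(scores, dim=-1)          # each row lies in (0, 1) and sums to 1
log_probs = F.log_softmax(scores, dim=-1)  # numerically stabler input for NLL loss

# Training would typically combine log_probs with nll_loss on the gold index:
targets = torch.tensor([0, 2])
loss = F.nll_loss(log_probs, targets)
```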
The token vocabulary built from the training data does not fully match the pre-trained GloVe vocabulary (it contains non-English tokens). Should we still load the pre-trained embeddings?
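One common workaround, sketched below: copy GloVe vectors only for tokens present in both vocabularies and leave the rest randomly initialized. The file path, vocabulary layout (a token-to-index dict), and function name are placeholders, not this project's actual API.

```python
import numpy as np

def load_glove_partial(vocab, glove_path, dim=300):
    """Build an embedding matrix for `vocab` (a token -> index dict),
    copying GloVe vectors where available and keeping random vectors
    for out-of-vocabulary (e.g. non-English) tokens."""
    matrix = np.random.normal(scale=0.1, size=(len(vocab), dim)).astype("float32")
    hits = 0
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            token, values = parts[0], parts[1:]
            if token in vocab and len(values) == dim:
                matrix[vocab[token]] = np.asarray(values, dtype="float32")
                hits += 1
    print(f"matched {hits}/{len(vocab)} tokens")
    return matrix
```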
forward() receives the fields produced by the dataset reader: all of them arrive as padded, batched tensors except the metadata field, which is passed through as raw per-example data. See the sketch below.
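A sketch of what that forward signature could look like; the argument names (tokens, candidates, scores, target, metadata) follow the notes above but are assumptions about this project's dataset reader, and the class name is invented.

```python
from typing import Any, Dict, List, Optional

import torch


class EntityLinker(torch.nn.Module):
    def forward(self,
                tokens: Dict[str, torch.LongTensor],               # padded token ids: (batch, seq_len)
                candidates: torch.LongTensor,                      # padded candidate WIDs: (batch, num_candidates)
                scores: torch.FloatTensor,                         # candidate priors: (batch, num_candidates)
                target: Optional[torch.LongTensor] = None,         # gold candidate index: (batch,)
                metadata: Optional[List[Dict[str, Any]]] = None,   # raw per-example info, not a tensor
                ) -> Dict[str, torch.Tensor]:
        output: Dict[str, torch.Tensor] = {}
        # ...model computation and loss would go here...
        return output
```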
Use pre-trained embeddings (GloVe); see the sketch below.
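A minimal way to wrap a pre-built GloVe weight matrix as a PyTorch embedding layer; the placeholder weights stand in for the matrix a loader like the (hypothetical) `load_glove_partial` sketch above would produce.

```python
import numpy as np
import torch

# Placeholder for the (vocab_size, 300) matrix produced by a GloVe loader.
glove_matrix = np.random.rand(10000, 300).astype("float32")

embedding = torch.nn.Embedding.from_pretrained(
    torch.from_numpy(glove_matrix),
    freeze=False,   # keep fine-tuning the vectors during training
)
```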
Embeddings fed a one-hot (multi-hot) vector built from the multi-labels; the dimensionality must match the label vocabulary size plus 1 for UNK.
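A small sketch of building that multi-hot vector with the extra UNK slot; the fallback-to-UNK behaviour for label-less examples is an assumption.

```python
import torch

def multihot(label_indices, num_labels):
    """Encode a list of label indices as a multi-hot vector.

    The vector has `num_labels + 1` dimensions: one per known label
    plus a trailing UNK slot (index `num_labels`) used when the
    example carries no known label.
    """
    vec = torch.zeros(num_labels + 1)
    if label_indices:
        vec[torch.tensor(label_indices)] = 1.0
    else:
        vec[num_labels] = 1.0   # fall back to the UNK dimension
    return vec

print(multihot([1, 3], num_labels=5))  # tensor([0., 1., 0., 1., 0., 0.])
print(multihot([], num_labels=5))      # tensor([0., 0., 0., 0., 0., 1.])
```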
DenseSparseAdam encountered a numerical error during training.
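A commonly tried mitigation for Adam-family numerical issues is raising `eps` (and/or lowering the learning rate). A hedged sketch of the optimizer block, written as a Python dict for illustration and assuming AllenNLP's `dense_sparse_adam` registration; the values are guesses, not the project's settings.

```python
# Hypothetical optimizer section of the trainer config (shown as a Python dict).
optimizer_config = {
    "type": "dense_sparse_adam",
    "lr": 1e-3,
    "betas": [0.9, 0.999],
    "eps": 1e-6,   # larger than the 1e-8 default, to dampen numerical blow-ups
}
```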
How are the initializer and the regularizer specified in the config?
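In AllenNLP 0.x-style configs, both are lists of [parameter-name-regex, options] pairs under the model section. A hedged sketch, written as a Python dict for illustration; the regexes, initializer types, and alpha value below are examples only and should be checked against the AllenNLP version in use.

```python
# Fragment of the "model" section of the experiment config.
model_config_fragment = {
    "initializer": [
        ["embedder.*weight", {"type": "xavier_uniform"}],
        [".*bias", {"type": "zero"}],
    ],
    "regularizer": [
        [".*weight", {"type": "l2", "alpha": 0.01}],
    ],
}
```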