pytorch-widedeep/pytorch_widedeep/models/transformers/_embeddings_layers.py
Line 15 in c287c87
Fixed #60 and #61: placing dropout in the right position and applying FullEmbeddingDropout only in training (3532a98)
@jmpark-swk just in case, this has now been fixed in the branch https://github.com/jrzaurin/pytorch-widedeep/tree/pmulinka/uncertainty
Soon to be merged to master :)
#60
There is a bug in which FullEmbeddingDropout operates in both training and evaluation mode, so embeddings are still dropped at inference time.
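The fix amounts to gating the dropout on the module's `self.training` flag, the same way `nn.Dropout` behaves. A minimal sketch of a training-aware full-embedding dropout is below; the exact shapes and constructor signature in pytorch-widedeep may differ, this only illustrates the gating:

```python
import torch
from torch import nn


class FullEmbeddingDropout(nn.Module):
    """Drops entire embedding rows at once (one Bernoulli draw per row).

    Sketch of the fix discussed in this issue: dropout is applied only
    when self.training is True; in eval mode the input passes through
    unchanged.
    """

    def __init__(self, p: float = 0.1):
        super().__init__()
        self.p = p

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # X assumed to be (batch, n_features, embed_dim)
        if self.training:
            # one keep/drop decision per embedding row, rescaled so the
            # expected value of the output matches the input
            mask = X.new_empty((X.size(0), X.size(1), 1)).bernoulli_(
                1 - self.p
            ) / (1 - self.p)
            return X * mask
        # evaluation: identity, as standard dropout behaves
        return X
```

Calling `model.eval()` sets `self.training = False` on every submodule, so with this guard in place inference is deterministic.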