The PyTorch documentation shows:
class torch.nn.CrossEntropyLoss(weight=None, size_average=True)[source] — This criterion combines LogSoftMax and NLLLoss in one single class.
So could I remove the softmax layer from the model's Sequential?
model.add_module("softmax", torch.nn.Softmax())
Commit f32192e: No need to use the softmax layer since CrossEntropyLoss already uses it internally (issue #1).
I've checked CrossEntropyLoss and found that it applies log-softmax internally (it combines LogSoftmax and NLLLoss), so a separate softmax layer in the model is unnecessary. Thanks for pointing that out.
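The equivalence can be checked numerically. A minimal sketch (the tensor shapes and values here are illustrative, not from the original model): feeding raw logits to CrossEntropyLoss should give the same loss as LogSoftmax followed by NLLLoss.

```python
import torch

torch.manual_seed(0)
logits = torch.randn(4, 3)            # raw model outputs, no softmax applied
target = torch.tensor([0, 2, 1, 0])   # class indices for each sample

# CrossEntropyLoss applied directly to the logits...
ce = torch.nn.CrossEntropyLoss()(logits, target)

# ...matches LogSoftmax followed by NLLLoss.
log_probs = torch.nn.LogSoftmax(dim=1)(logits)
nll = torch.nn.NLLLoss()(log_probs, target)

assert torch.allclose(ce, nll)
```

Note that if the model kept its final Softmax layer and the output were then passed to CrossEntropyLoss, softmax would effectively be applied twice, which distorts the loss and its gradients.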
Do you have a line reference for this in the PyTorch documentation?