About CrossEntropyLoss #1

Closed
SmartAI opened this issue Feb 20, 2017 · 2 comments

SmartAI commented Feb 20, 2017

The PyTorch documentation shows:

class torch.nn.CrossEntropyLoss(weight=None, size_average=True)
This criterion combines LogSoftMax and NLLLoss in one single class.

So, could I remove the softmax layer from the model's Sequential?

model.add_module("softmax", torch.nn.Softmax())
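The documented equivalence can be checked numerically. Below is a minimal sketch using the current torch.nn API; the batch size, class count, and target values are arbitrary:

import torch

# Random logits for a batch of 4 samples over 3 classes, plus integer class targets.
logits = torch.randn(4, 3)
target = torch.tensor([0, 2, 1, 2])

# CrossEntropyLoss applied directly to the raw logits...
ce = torch.nn.CrossEntropyLoss()(logits, target)

# ...matches LogSoftmax followed by NLLLoss.
log_probs = torch.nn.LogSoftmax(dim=1)(logits)
nll = torch.nn.NLLLoss()(log_probs, target)

print(torch.allclose(ce, nll))  # True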
vinhkhuc added a commit that referenced this issue Feb 26, 2017
vinhkhuc (Owner) commented

I've checked CrossEntropyLoss and found that it computes softmax internally. So, the softmax layer is unnecessary. Thanks for pointing that out.
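In other words, the model should output raw logits and let the loss apply the softmax. A minimal sketch of the corrected setup; the layer sizes and module names here are illustrative placeholders, not taken from this repo:

import torch

# Hypothetical two-layer classifier; sizes are placeholders.
model = torch.nn.Sequential()
model.add_module("linear_1", torch.nn.Linear(784, 100))
model.add_module("relu", torch.nn.ReLU())
model.add_module("linear_2", torch.nn.Linear(100, 10))
# No softmax module here: CrossEntropyLoss expects raw logits.

loss_fn = torch.nn.CrossEntropyLoss()
x = torch.randn(32, 784)         # dummy input batch
y = torch.randint(0, 10, (32,))  # dummy integer class targets
loss = loss_fn(model(x), y)
print(loss.item())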

brando90 commented Jan 20, 2018

Do you have a line reference for this in the PyTorch documentation?
