
F.log_softmax(x, dim=1) output is not probability? #16

Open
xypan1232 opened this issue Aug 31, 2018 · 6 comments

Comments

@xypan1232

Hi,

Calling output = model(features, adj) does not give probability outputs. If I want the model to return probabilities, what should I change?
If I change log_softmax to softmax, does the loss function F.nll_loss also need to change?
Thanks.
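
[Editor's note: for context, a minimal sketch of the relationship between log_softmax, softmax, and F.nll_loss; the logits tensor below is an illustrative stand-in for the model's pre-softmax scores, not code from this repo.]

import torch
import torch.nn.functional as F

logits = torch.randn(3, 4)                 # 3 nodes, 4 classes (illustrative)
labels = torch.tensor([0, 2, 1])

log_probs = F.log_softmax(logits, dim=1)   # what model(features, adj) returns
probs = torch.exp(log_probs)               # log-probabilities -> probabilities
print(probs.sum(dim=1))                    # each row sums to 1

# F.nll_loss expects log-probabilities, so it pairs with log_softmax;
# feeding it plain softmax outputs would compute the wrong quantity.
loss = F.nll_loss(log_probs, labels)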

@tkipf
Owner

tkipf commented Aug 31, 2018 via email

@roireshef

Or use cross entropy loss...?

@tkipf
Owner

tkipf commented Feb 13, 2019 via email

@roireshef

@tkipf I meant this loss: https://pytorch.org/docs/0.3.1/nn.html#torch.nn.CrossEntropyLoss
AFAIK it takes probabilities.
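
[Editor's note: nn.CrossEntropyLoss in PyTorch combines LogSoftmax and NLLLoss, so it takes raw (unnormalized) logits rather than probabilities. A minimal sketch of the equivalence, with illustrative tensors:]

import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(3, 4)          # raw scores, no softmax applied
labels = torch.tensor([0, 2, 1])

ce = nn.CrossEntropyLoss()(logits, labels)
nll = F.nll_loss(F.log_softmax(logits, dim=1), labels)
print(torch.allclose(ce, nll))      # True: the two formulations match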

@tkipf
Owner

tkipf commented Feb 13, 2019 via email

@NinaM31

NinaM31 commented Apr 25, 2021

log_softmax applies the log to the softmax; in PyTorch, torch.log is the natural log by default.
You can think of it like this: ŷ = softmax(x) is the probability vector, and what the model returns is output = log_softmax(x) = log(ŷ).

By the laws of logarithms, exp undoes the log: exp(output) = exp(log(ŷ)) = ŷ.

So output = torch.exp(model(features, adj)) will give you the results as probabilities.
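
[Editor's note: a quick sketch of that identity; x is an arbitrary score tensor, standing in for the model's pre-softmax output.]

import torch
import torch.nn.functional as F

x = torch.randn(2, 3)

y_hat = F.softmax(x, dim=1)          # probabilities
output = F.log_softmax(x, dim=1)     # log-probabilities

# exp undoes the natural log, recovering the softmax probabilities.
print(torch.allclose(torch.exp(output), y_hat))  # True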
