
Negative JSELoss #9

Closed · HannesStark opened this issue Apr 19, 2021 · 1 comment

Labels
sslgraph Self-supervised Learning on Graphs

Comments

HannesStark (Contributor) commented Apr 19, 2021

I was initially confused by getting a negative loss from the JSELoss, and by the implementation here, where the Jensen-Shannon divergence is shifted:

import numpy as np
import torch.nn.functional as F

log_2 = np.log(2.)
if positive:
    # positive-pair score: log(2) - softplus(-d')
    score = log_2 - F.softplus(-masked_d_prime)
else:
    # negative-pair score: softplus(-d') + d' - log(2) = softplus(d') - log(2)
    score = F.softplus(-masked_d_prime) + masked_d_prime - log_2
return score

To save others from stumbling over this:
Apparently, this choice was made for consistency with other divergence measures, and it does not affect training or model performance.
This issue in the Deep InfoMax library explains why the shift is included: rdevon/DIM#19
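For intuition, here is a minimal sketch of the effect (not the DIG API; jse_loss, the score tensors, and their shapes are hypothetical), showing that the shifted estimator naturally goes negative once the discriminator separates positive from negative pairs:

import numpy as np
import torch
import torch.nn.functional as F

log_2 = np.log(2.)

def jse_loss(pos_scores, neg_scores):
    # Shifted JSD estimator: minimize E_neg - E_pos
    # (the negated mutual-information lower bound).
    e_pos = (log_2 - F.softplus(-pos_scores)).mean()
    e_neg = (F.softplus(-neg_scores) + neg_scores - log_2).mean()
    return e_neg - e_pos

# Well-separated discriminator scores (hypothetical values):
pos = torch.full((8,), 5.0)   # high scores for positive pairs
neg = torch.full((8,), -5.0)  # low scores for negative pairs
print(jse_loss(pos, neg).item())  # ~ -1.37, close to -2*log(2)

Because each per-sample score is shifted by log(2), a perfectly separating discriminator drives the loss toward -2*log(2) rather than 0, so negative values are expected and harmless.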

mengliu1998 added the sslgraph Self-supervised Learning on Graphs label Apr 19, 2021
ycremar (Collaborator) commented Apr 20, 2021

Thank you, @HannesStark!

ycremar closed this as completed Apr 20, 2021