Consistency loss #5

Closed
MaekTec opened this issue Jul 18, 2022 · 1 comment

Comments

MaekTec commented Jul 18, 2022

Hi, thanks for your great work! I have a question about the consistency loss between the teacher and the student on the unlabeled points. In the code you use the KL divergence, but in the paper (formula 3) it's something different. To me, formula 3 looks like a soft version of cross-entropy with the minus sign missing. Should it instead be the KL divergence (https://pytorch.org/docs/stable/generated/torch.nn.KLDivLoss.html) with part of it left out, or am I missing something?

ouenal (Owner) commented Jul 18, 2022

That is a good point to bring up, thank you. I hadn't noticed it, but it should indeed be the KL divergence; I'm a bit baffled as to how I managed to write the loss that way in the paper.

To clarify, I've always used the KL divergence for the unsupervised loss, so the codebase is correct.
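For reference, the term is in the spirit of the sketch below, using torch.nn.functional.kl_div (the functional form of the KLDivLoss linked above); the function and variable names are illustrative, not taken verbatim from the repository:

```python
import torch.nn.functional as F

def consistency_loss(student_logits, teacher_logits):
    # KL(teacher || student) on the unlabeled points.
    # kl_div expects log-probabilities as input and probabilities as target.
    student_log_probs = F.log_softmax(student_logits, dim=-1)
    # Stop gradients from flowing through the teacher predictions.
    teacher_probs = F.softmax(teacher_logits.detach(), dim=-1)
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
```

Since the teacher-entropy term in the KL divergence is constant with respect to the student, the gradients coincide with those of the soft cross-entropy you describe.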

ouenal closed this as completed Jul 18, 2022