
The issue for the loss of regression tasks #45

Closed
ChuanMeng opened this issue Dec 6, 2022 · 1 comment
ChuanMeng commented Dec 6, 2022

loss_fct = nn.KLDivLoss(log_target=True)

Hi Tianyu,
Thank you for releasing the code.
I found one problem with the loss of regression tasks.

"logits" is through the operation "logsoftmax" and is in [-infinite, 0], while "labels" is not through that operation and is always greater than 0. They are not in the same space, so I think here log_target cannot be True. Or the "labels" should be operated by "torch.log(labels+very small number)".

What do you think of it?

Looking forward to your reply.

Best wishes,
Chuan

gaotianyu1350 (Member) commented

Hi Chuan,

Thanks for your interest in our work! It seems that you are right, and I have no idea why the wrong version actually worked! Please let me know if you try the other way and it works better!
