About loss_c in Multibox Loss #14
Comments
@taneslle Hi, have you figured out why?
Because we want to get the softmax loss for each element. However, early versions of PyTorch did not support this, and the function only produced the sum or the average loss over all elements.
@lzx1413 Thank you so much for your reply! Now we can pass reduce=False to get the element-wise loss. So, do you mean we can just use the …
You can calculate the softmax loss once and generate a mask for the positive instances and the hard-negative instances. Then you can multiply them together to obtain the loss terms that actually count.
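A minimal sketch of that mask-based idea, assuming illustrative shapes and names (batch_conf, conf_t, the 3:1 negative ratio are my own placeholders, not taken from the repo); reduction='none' is the modern replacement for the old reduce=False flag:

```python
import torch
import torch.nn.functional as F

# batch_conf: (num_priors, num_classes) class logits, flattened over the batch
# conf_t:     (num_priors,) target labels, 0 = background
batch_conf = torch.randn(8, 21, requires_grad=True)
conf_t = torch.randint(0, 21, (8,))

# Element-wise softmax cross-entropy, computed once.
loss_all = F.cross_entropy(batch_conf, conf_t, reduction='none')

# Positive mask: priors matched to an object class.
pos_mask = conf_t > 0

# Hard-negative mask: keep the highest-loss background priors (3:1 neg:pos here).
neg_loss = loss_all.clone().detach()
neg_loss[pos_mask] = 0  # ignore positives when ranking negatives
num_neg = min(3 * int(pos_mask.sum()), int((~pos_mask).sum()))
neg_idx = neg_loss.topk(num_neg).indices
neg_mask = torch.zeros_like(pos_mask)
neg_mask[neg_idx] = True

# Multiply the element-wise loss by the combined mask and normalize by positives.
mask = (pos_mask | neg_mask).float()
loss_c = (loss_all * mask).sum() / pos_mask.sum().clamp(min=1)
```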
Yes. Another issue is whether we can do it by calling F.cross_entropy twice. Would that change the input tensor's gradient, since the forward pass of the same input is tracked both times? Thank you for your help again!
If you don't add the first softmax loss to the final total loss, it will not affect the network.
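A tiny check of why that holds, with made-up shapes of my own: autograd only accumulates gradients from the scalar you call backward() on, so a second cross_entropy pass used purely for ranking leaves the gradients untouched.

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3, requires_grad=True)
targets = torch.tensor([0, 2, 1, 0])

# First pass: element-wise loss, used only to rank hard negatives.
loss_rank = F.cross_entropy(logits, targets, reduction='none')

# Second pass: the loss that actually enters the objective.
loss_final = F.cross_entropy(logits, targets)
loss_final.backward()

# logits.grad now reflects only loss_final; loss_rank was never added to the
# scalar we backpropagated through, so it has no effect on the update.
```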
Thank you. |
Hi~
In PytorchSSD/layers/modules/multibox_loss.py (line 93 in cd5776d):
loss_c = log_sum_exp(batch_conf) - batch_conf.gather(1, conf_t.view(-1,1))
This operation looks to me the same as computing the softmax cross-entropy loss, so why not use torch.nn.functional.cross_entropy directly?
P.S. See also the +x_max and -x_max in PytorchSSD/utils/box_utils.py (line 274 in cd5776d).
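A small sanity check (my own sketch, not code from the repo) of the equivalence being asked about: log_sum_exp(x) - x[target] is exactly the negative log-softmax at the target class, i.e. F.cross_entropy with reduction='none', and the +x_max / -x_max inside log_sum_exp is the standard numerical-stability shift that cancels out mathematically.

```python
import torch
import torch.nn.functional as F

def log_sum_exp(x):
    # Stable log(sum(exp(x), dim=1)): subtract x_max inside exp, add it back outside.
    x_max = x.detach().max()
    return torch.log(torch.sum(torch.exp(x - x_max), 1, keepdim=True)) + x_max

batch_conf = torch.randn(8, 21)
conf_t = torch.randint(0, 21, (8,))

manual = (log_sum_exp(batch_conf) - batch_conf.gather(1, conf_t.view(-1, 1))).squeeze(1)
builtin = F.cross_entropy(batch_conf, conf_t, reduction='none')

print(torch.allclose(manual, builtin, atol=1e-6))  # expected: True
```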