
issue on BinaryFocalLoss #12

Open
quancore opened this issue May 20, 2020 · 1 comment

quancore commented May 20, 2020

I have three questions:

  1. Why are you applying sigmoid at the beginning of the forward call?
  2. Is the label smoothing correct? Should we apply it to the target rather than the output?
  3. Why are you not dividing neg_loss by num_neg when num_pos == 0? (See the sketch below.)
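
For reference, here is a minimal sketch of how all three points could be addressed together. This is a hypothetical rewrite for illustration (the class name, defaults, and the `smooth` parameter are my assumptions), not the code in this repository:

```python
import torch
import torch.nn as nn

class BinaryFocalLossSketch(nn.Module):
    """Hypothetical binary focal loss illustrating the three points above."""

    def __init__(self, gamma=2.0, alpha=0.25, smooth=0.1):
        super().__init__()
        self.gamma = gamma    # focusing parameter
        self.alpha = alpha    # positive-class weight
        self.smooth = smooth  # label-smoothing strength (assumed parameter)

    def forward(self, logits, target):
        # (1) sigmoid maps raw logits to probabilities in (0, 1)
        prob = torch.sigmoid(logits).clamp(1e-6, 1.0 - 1e-6)
        pos_mask = target > 0.5  # hard mask, taken before smoothing
        # (2) label smoothing applied to the target, not the model output
        target = target * (1.0 - self.smooth) + 0.5 * self.smooth
        pos_loss = -self.alpha * target * (1.0 - prob) ** self.gamma * torch.log(prob)
        neg_loss = -(1.0 - self.alpha) * (1.0 - target) * prob ** self.gamma * torch.log(1.0 - prob)
        total = (pos_loss + neg_loss).sum()
        num_pos = pos_mask.sum()
        # (3) when there are no positives, normalise by num_neg instead
        if num_pos > 0:
            return total / num_pos
        return total / (~pos_mask).sum().clamp(min=1)
```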

shahzad-ali commented Oct 8, 2020

This code has several deficiencies, and you have highlighted some of them.

  1. Why are you applying sigmoid at the beginning of the forward call?

It is reasonable to assume that the model @Hsuxu used produces logits as output (i.e. no sigmoid/softmax is applied inside the model), so the first thing to do before calculating any probability-based loss is to apply an activation function.
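
For instance, assuming a PyTorch model whose last layer is linear (so it emits raw logits), the explicit `sigmoid` and the logits-based loss are equivalent:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8)                     # raw model outputs, no activation
target = torch.randint(0, 2, (8,)).float()  # binary ground truth

# Explicit activation first, then the probability-based loss:
prob = torch.sigmoid(logits)
loss_explicit = F.binary_cross_entropy(prob, target)

# Numerically safer variant that folds the sigmoid into the loss:
loss_logits = F.binary_cross_entropy_with_logits(logits, target)

assert torch.allclose(loss_explicit, loss_logits, atol=1e-6)
```

The same reasoning applies here: if the network itself applies no final activation, the `sigmoid` at the top of `forward` is what maps logits into [0, 1] before the focal terms are computed.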

Let's hope @Hsuxu will answer the rest of the questions soon!
