Hello, and thank you very much for your work. In EQL, the cross-entropy loss is used for the background category. How does EQLv2 handle the background category?
Also, how should I understand this sentence: "one task with positive label and C−1 tasks with negative labels are introduced by a single instance."?
Looking forward to your reply!
Suppose we have C=5 categories.
In softmax loss, the classifier outputs C+1 channels; the extra channel is for the background. At inference, the background score suppresses the other categories' scores because of the softmax normalization.
In sigmoid loss, the classifier outputs C channels. Each channel is activated by a sigmoid function and is responsible for one category, so each channel needs a ground-truth label (0 or 1) during training.
Given a proposal of category j, for example j=2, we rewrite the gt_label as [0, 0, 1, 0, 0]. That is, a single instance introduces one task with a positive label and C−1 tasks with negative labels.
For the background task, the gt_label is [0, 0, 0, 0, 0].
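The target construction described above can be sketched as follows. This is a minimal illustration, not the repository's actual code; `binary_targets` is a hypothetical helper, and using the index C to denote background is one common convention that may differ in a given codebase.

```python
C = 5  # number of foreground categories, matching the example above

def binary_targets(gt_label, num_classes=C):
    """Per-channel sigmoid targets for one proposal.

    gt_label: 0..num_classes-1 for a foreground category;
    num_classes is used here to denote background (an assumed convention).
    """
    target = [0.0] * num_classes
    if gt_label < num_classes:   # foreground: one positive task,
        target[gt_label] = 1.0   # the remaining C-1 tasks stay negative
    return target                # background: all C tasks are negative

print(binary_targets(2))  # [0.0, 0.0, 1.0, 0.0, 0.0]
print(binary_targets(C))  # [0.0, 0.0, 0.0, 0.0, 0.0]
```

Each of the C channels is then trained with an ordinary binary cross-entropy against its entry in this target vector, so the background needs no dedicated channel: it is simply the all-negative case.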