
could you explain why you are using binary cross-entropy #8

Open
xysong1201 opened this issue Mar 6, 2020 · 1 comment

Comments

@xysong1201

Hi, inside the CB loss you are using binary cross-entropy, so why not use cross-entropy? Could you explain? Thank you.

@BenDrewry

From the source paper (https://arxiv.org/pdf/1901.05555.pdf): "Note that β = 0 corresponds to no re-weighting and β → 1 corresponds to re-weighing by inverse class frequency."

It looks like they use binary CE here so that each class's term can be scaled independently by a weight based on its effective number of samples. The targets are one-hot, so the true class is 1 while the rest are 0s, and each per-class sigmoid term gets its own class-balanced weight.
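For concreteness, here is a minimal sketch of that weighting scheme, assuming PyTorch. The function name `cb_binary_cross_entropy` and the `samples_per_class` argument are illustrative, not the repo's actual API; the weight formula follows the paper's effective-number definition.

```python
import torch
import torch.nn.functional as F

def cb_binary_cross_entropy(logits, labels, samples_per_class, beta=0.9999):
    """Class-balanced BCE in the spirit of Cui et al., 2019 (arXiv:1901.05555).

    logits: (batch, num_classes) raw scores
    labels: (batch,) integer class indices
    samples_per_class: per-class training sample counts
    """
    num_classes = logits.size(1)
    n = torch.as_tensor(samples_per_class, dtype=torch.float)

    # Effective number of samples per class: E_n = (1 - beta^n) / (1 - beta).
    # beta = 0 gives uniform weights (no re-weighting); beta -> 1 approaches
    # inverse class frequency, as the quoted passage says.
    effective_num = (1.0 - torch.pow(beta, n)) / (1.0 - beta)

    # Weight each class by the inverse of its effective number, normalized
    # so the weights sum to num_classes.
    weights = 1.0 / effective_num
    weights = weights / weights.sum() * num_classes

    # One-hot targets: the true class is 1, every other class is 0,
    # which is what makes per-class (sigmoid) binary CE applicable.
    one_hot = F.one_hot(labels, num_classes).float()

    # Broadcast each sample's true-class weight across its row of terms.
    sample_weights = weights[labels].unsqueeze(1).expand_as(one_hot)

    return F.binary_cross_entropy_with_logits(
        logits, one_hot, weight=sample_weights, reduction="mean"
    )
```

With a skewed count vector such as `samples_per_class=[5000, 100, 10]`, the rare classes end up with much larger weights, so their BCE terms dominate the average.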
