This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

Combine KLDivergenceBCELoss with SoftHardBCELoss and F.cross_entropy() in CrossEntropyLoss #689

Closed
wants to merge 1 commit

Conversation

hikushalhere (Contributor)

Summary:

  1. By virtue of its name, `KLDivergenceBCELoss` should handle BCE loss with hard targets, but it doesn't; `SoftHardBCELoss` does exactly that. This change combines the two.
  2. Use `F.cross_entropy()` in `CrossEntropyLoss`, which is numerically more stable than chaining `log_softmax()` -> `nll_loss()` (see the sketch below).
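
For point 2, a minimal sketch of why the fused call is preferred (shapes and values here are illustrative, not from this diff): `F.cross_entropy()` computes the same quantity as the two-step chain, but in a single call that handles the log-sum-exp internally instead of materializing intermediate log-probabilities.

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8, 5)           # illustrative: batch of 8, 5 classes
targets = torch.randint(0, 5, (8,))  # hard class indices

# Two-step form: explicit log_softmax followed by nll_loss.
two_step = F.nll_loss(F.log_softmax(logits, dim=1), targets)

# Fused form: mathematically equivalent, computed in one numerically
# stable call with no intermediate log-probability tensor.
fused = F.cross_entropy(logits, targets)

assert torch.allclose(two_step, fused)
```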

Differential Revision: D15795206

facebook-github-bot added the CLA Signed label on Jun 13, 2019
Combine KLDivergenceBCELoss with SoftHardBCELoss and F.cross_entropy() in CrossEntropyLoss (#689)

Summary:
Pull Request resolved: #689

1. By virtue of its name, `KLDivergenceBCELoss` should handle BCE loss with hard targets, but it doesn't; `SoftHardBCELoss` does exactly that. This diff combines the two (see the sketch after this list).
2. Use `F.cross_entropy()` in `CrossEntropyLoss`, which is numerically more stable than chaining `log_softmax()` -> `nll_loss()`.
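
As illustration only (this is not the repository's implementation), a hypothetical sketch of the combined behavior point 1 describes: a KL-divergence term against soft targets plus a BCE term against hard targets. The function name, the `alpha` weight, and all argument names are assumptions made for this sketch.

```python
import torch
import torch.nn.functional as F

def combined_soft_hard_loss(logits, soft_targets, hard_targets, alpha=0.5):
    # Hypothetical sketch, not the actual diff. Soft term: KL divergence
    # between predicted log-probabilities and a soft (e.g. teacher) distribution.
    kl = F.kl_div(F.log_softmax(logits, dim=1), soft_targets, reduction="batchmean")
    # Hard term: BCE with logits against one-hot hard labels.
    bce = F.binary_cross_entropy_with_logits(logits, hard_targets)
    # `alpha` (assumed) blends the soft and hard terms.
    return alpha * kl + (1.0 - alpha) * bce

# Illustrative usage with made-up shapes:
logits = torch.randn(8, 5)
soft = F.softmax(torch.randn(8, 5), dim=1)              # soft target distribution
hard = F.one_hot(torch.randint(0, 5, (8,)), 5).float()  # hard labels, one-hot
loss = combined_soft_hard_loss(logits, soft, hard)
```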

Reviewed By: gardenia22

Differential Revision: D15795206

fbshipit-source-id: e358264b17f11b78daf49936b35c6a0b3a958fc6
facebook-github-bot (Contributor)

This pull request has been merged in d994b4a.
