
add hard_cross_entropy #19

Closed
coreylowman opened this issue May 19, 2022 · 2 comments
Labels
new feature New feature or request

Comments

@coreylowman
Owner

Currently this only works for actual probability distributions. Hard cross entropy has only 1 non-zero entry in the inner dimension, so sum across that before taking the mean.
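A minimal sketch in plain Rust (arrays rather than library tensors; all values assumed) of why the inner sum collapses for hard targets: with a one-hot target row, the full sum -Σ_i t_i * ln(p_i) reduces to -ln(p_class), since only one t_i is non-zero.

```rust
fn main() {
    // predicted probabilities for one example (assumed values)
    let probs = [0.1f32, 0.7, 0.2];
    // one-hot target for class 1
    let target = [0.0f32, 1.0, 0.0];

    // full soft-target sum: -Σ_i t_i * ln(p_i)
    let soft: f32 = probs
        .iter()
        .zip(target.iter())
        .map(|(p, t)| -t * p.ln())
        .sum();

    // hard-target shortcut: only the class-1 entry survives the sum
    let hard = -probs[1].ln();

    assert!((soft - hard).abs() < 1e-6);
}
```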

@coreylowman coreylowman added the new feature New feature or request label May 19, 2022
@coreylowman
Owner Author

After #12, hard targets could be a tensor with dtype = usize. For now, cross_entropy could accept an enum { HardTargets, SoftTargets } to differentiate?
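A hypothetical sketch of that proposal using plain Rust vectors (the names `Targets` and `mean_cross_entropy`, and all shapes, are assumptions for illustration, not the crate's API): an enum distinguishes class-index targets from full distributions, and the hard branch indexes the single non-zero entry instead of summing.

```rust
// Hypothetical API sketch: targets are either class indices or full rows.
enum Targets {
    Hard(Vec<usize>),    // one class index per example
    Soft(Vec<Vec<f32>>), // one probability row per example
}

// Mean cross entropy over a batch of log-probability rows.
fn mean_cross_entropy(log_probs: &[Vec<f32>], targets: &Targets) -> f32 {
    let n = log_probs.len() as f32;
    match targets {
        // hard: only one non-zero target entry per row, so just index it
        Targets::Hard(classes) => log_probs
            .iter()
            .zip(classes)
            .map(|(lp, &c)| -lp[c])
            .sum::<f32>()
            / n,
        // soft: full inner product with the target distribution
        Targets::Soft(dists) => log_probs
            .iter()
            .zip(dists)
            .map(|(lp, d)| -lp.iter().zip(d).map(|(l, t)| l * t).sum::<f32>())
            .sum::<f32>()
            / n,
    }
}

fn main() {
    // log-probabilities for two examples over three classes (assumed values)
    let log_probs = vec![
        vec![(0.2f32).ln(), (0.5f32).ln(), (0.3f32).ln()],
        vec![(0.6f32).ln(), (0.3f32).ln(), (0.1f32).ln()],
    ];
    let hard = mean_cross_entropy(&log_probs, &Targets::Hard(vec![1, 0]));
    let soft = mean_cross_entropy(
        &log_probs,
        &Targets::Soft(vec![vec![0.0, 1.0, 0.0], vec![1.0, 0.0, 0.0]]),
    );
    // both paths agree when the soft targets are one-hot
    assert!((hard - soft).abs() < 1e-6);
}
```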

@coreylowman coreylowman mentioned this issue May 26, 2022
@coreylowman
Owner Author

This is not really needed - the same function can be used for both; the user should just one-hot encode the class labels into probabilities before calling.
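A sketch of that approach with plain Rust vectors (`one_hot` is a hypothetical helper, not part of the library): encode each class index as a one-hot probability row, then pass the rows to the existing soft-target cross entropy unchanged.

```rust
// Hypothetical helper: turn class indices into one-hot probability rows
// so the existing soft-target cross entropy applies as-is.
fn one_hot(labels: &[usize], num_classes: usize) -> Vec<Vec<f32>> {
    labels
        .iter()
        .map(|&c| {
            let mut row = vec![0.0; num_classes];
            row[c] = 1.0;
            row
        })
        .collect()
}

fn main() {
    let targets = one_hot(&[2, 0], 3);
    assert_eq!(targets, vec![vec![0.0, 0.0, 1.0], vec![1.0, 0.0, 0.0]]);
}
```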
