
Refine the doc of sigmoid_binary_cross_entropy to not assume the meaning of last dimension. #418

Merged 1 commit into master on Sep 16, 2022

Conversation

copybara-service[bot]

Refine the doc of sigmoid_binary_cross_entropy to not assume the meaning of last dimension.

This loss is purely elementwise, so the last dimension can mean anything, not necessarily `num_classes`.
For example, `logits` can be a one-dimensional vector whose only dimension is the batch.
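To illustrate the elementwise behavior the PR describes, here is a minimal sketch of a numerically stable sigmoid binary cross-entropy (not the exact optax source, though optax's version is equivalent): the output always has the same shape as `logits`, so no axis needs to be a `num_classes` axis.

```python
# Sketch of an elementwise sigmoid binary cross-entropy, assuming the
# standard stable formulation via log-sigmoid. Not the exact library code.
import jax.numpy as jnp
from jax import nn


def sigmoid_binary_cross_entropy(logits, labels):
    # Elementwise: -labels * log(sigmoid(logits))
    #              - (1 - labels) * log(1 - sigmoid(logits)),
    # written with log_sigmoid for numerical stability.
    return (-labels * nn.log_sigmoid(logits)
            - (1.0 - labels) * nn.log_sigmoid(-logits))


# A 1-D `logits` whose only dimension is the batch -- no `num_classes` axis.
logits = jnp.array([-1.5, 0.0, 2.0])
labels = jnp.array([0.0, 1.0, 1.0])
loss = sigmoid_binary_cross_entropy(logits, labels)
print(loss.shape)  # same shape as the inputs: (3,)
```

Because the loss is elementwise, callers decide how (or whether) to reduce it, e.g. `loss.mean()` over a batch dimension.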

@copybara-service copybara-service bot force-pushed the test_474589208 branch 5 times, most recently from 649b1ec to ab52a83 on September 16, 2022 15:15
PiperOrigin-RevId: 474820626
@copybara-service copybara-service bot merged commit 68bec46 into master Sep 16, 2022
@copybara-service copybara-service bot deleted the test_474589208 branch September 16, 2022 15:32
