CrossEntropyLoss for 3D and higher #3556
Comments
agreed, would be good to have. We'll get this done.
Looking into this
fixed via #4035
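Since the fix, `CrossEntropyLoss` accepts inputs with extra spatial dimensions, i.e. logits of shape `(N, C, d1, d2, ...)` with class-index targets of shape `(N, d1, d2, ...)`. A minimal sketch (the shapes here are made up for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical 3D segmentation batch: 2 samples, 4 classes, 8x8x8 volumes.
logits = torch.randn(2, 4, 8, 8, 8)          # (N, C, D, H, W)
target = torch.randint(0, 4, (2, 8, 8, 8))   # (N, D, H, W), class indices

# No reshaping needed: the loss is averaged over all spatial positions.
loss = nn.CrossEntropyLoss()(logits, target)
```

The class dimension must be dimension 1 of the input, matching the convention used by `Conv3d` and friends.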
@soumith and @zou3519: Thanks for this, I have the same problem. I am implementing a 3D U-Net loss function.
How can I use CrossEntropyLoss to compute the loss? This is my current way, but I am not sure it is correct.
Another way may be to convert to 2D.
Thanks!
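The "convert to 2D" route mentioned above can be sketched roughly as follows: move the class dimension last, flatten the spatial dimensions, and feed the result to `CrossEntropyLoss` as an ordinary `(N', C)` problem (shapes and names here are illustrative, not from the original comment's code):

```python
import torch
import torch.nn as nn

# Hypothetical 3D U-Net output: (N, C, D, H, W) logits, (N, D, H, W) targets.
N, C, D, H, W = 2, 4, 8, 8, 8
logits = torch.randn(N, C, D, H, W)
target = torch.randint(0, C, (N, D, H, W))

# Put the class dim last, then flatten everything else to one "batch" axis.
logits_2d = logits.permute(0, 2, 3, 4, 1).reshape(-1, C)  # (N*D*H*W, C)
target_1d = target.reshape(-1)                            # (N*D*H*W,)

loss = nn.CrossEntropyLoss()(logits_2d, target_1d)
```

With the default `reduction='mean'`, this gives the same value as passing the unflattened 5-D input directly, since both average over every spatial position.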
@John1231983 a better place to ask questions is the forums; your question will have higher visibility there than on a closed issue. Does it work for you now?
@zou3519: yes, it does not work because CrossEntropyLoss only works when my input size is like
Going hand-in-hand with #1020 and #1260, it would be nice to have the same multi-dim behavior for CrossEntropyLoss as there now is for softmax, similar to https://www.tensorflow.org/api_docs/python/tf/nn/sparse_softmax_cross_entropy_with_logits,
where the loss is evaluated along the last provided dimension.
At the moment one has to reshape into 2D before the operation and back again after.
It would not require a signature change and would not break current code that uses 2D inputs to the loss.
Any thoughts?