
CrossEntropyLoss for 3D and higher #3556

Closed
wadimkehl opened this issue Nov 7, 2017 · 6 comments

@wadimkehl

Going hand-in-hand with #1020 and #1260, it would be nice to have the same multi-dim behavior for CrossEntropyLoss as now exists for softmax. Similar to https://www.tensorflow.org/api_docs/python/tf/nn/sparse_softmax_cross_entropy_with_logits, where the loss is evaluated along the last provided dimension, i.e.

labels: shape [d_0, d_1, ..., d_{r-1}]
logits: shape [d_0, d_1, ..., d_{r-1}, num_classes]

At the moment one has to reshape into 2D before the operation and back again afterwards.
This would not require a signature change and would not break current code that uses 2D inputs to the loss.
Any thoughts?
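
For context, a minimal sketch of the reshape workaround described above (all shapes and names here are illustrative):

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
B, C, H, W = 4, 10, 8, 8
logits = torch.randn(B, C, H, W, requires_grad=True)
labels = torch.randint(0, C, (B, H, W))

# Move the class dimension last, then flatten to 2D: (B*H*W, C) logits vs (B*H*W,) labels
logits_2d = logits.permute(0, 2, 3, 1).contiguous().view(-1, C)
labels_1d = labels.view(-1)
loss = criterion(logits_2d, labels_1d)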

@soumith
Member

soumith commented Dec 1, 2017

agreed, would be good to have. We'll get this done.

@soumith soumith added this to neural-nets in Issue Categories Dec 1, 2017
@zou3519
Contributor

zou3519 commented Dec 4, 2017

Looking into this

@soumith
Member

soumith commented Jan 12, 2018

fixed via #4035

@soumith soumith closed this as completed Jan 12, 2018
@soumith soumith removed this from neural-nets in Issue Categories Feb 20, 2018
@John1231983

John1231983 commented May 11, 2018

@soumith and @zou3519: Thanks for this; I have the same problem. I am implementing a 3D U-Net loss function.

  • The labels have size batch_size x depth x height x width
  • The logits have size batch_size x num_class x depth x height x width

How can I use CrossEntropyLoss to compute the loss? This is my current approach, but I am not sure it is correct:

criterion = nn.CrossEntropyLoss()
for epoch in range(num_epoch):
    loss = 0
    for deep_slice in range(depth):
        # each depth slice is a 2D problem: logits (B, C, H, W), labels (B, H, W)
        loss += criterion(logits[:, :, deep_slice], labels[:, deep_slice])
    loss = loss / depth  # average over depth slices
    loss.backward()

Another way might be to convert to 2D:

B, C, D, H, W = logits.size()
# move the class dimension last, then flatten: (B*D*H*W, C) logits vs (B*D*H*W,) labels
logits_2d = logits.permute(0, 2, 3, 4, 1).contiguous().view(-1, C)
labels_1d = labels.view(-1)
loss = criterion(logits_2d, labels_1d)

Thanks!

@zou3519
Contributor

zou3519 commented May 11, 2018

@John1231983 a better place to ask questions is the forums; your question will have higher visibility there than on a closed issue.

Does criterion(logits, labels) not work for you?
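
For reference, since #4035 the K-dimensional case should work directly; a minimal sketch with illustrative shapes:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
B, C, D, H, W = 2, 4, 8, 16, 16
logits = torch.randn(B, C, D, H, W, requires_grad=True)  # (B, num_class, depth, height, width)
labels = torch.randint(0, C, (B, D, H, W))               # (B, depth, height, width), class indices
loss = criterion(logits, labels)                         # no reshaping needed
loss.backward()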

@John1231983

John1231983 commented May 11, 2018

@zou3519: It did not work at first because CrossEntropyLoss only accepted inputs of size B x C x H x W, but my case is B x C x D x H x W.
Update: Sorry, it works now with my input size. I think I have upgraded PyTorch.
