Hello,

I am trying to run your code on a medical imaging task, and unfortunately I can't get it to train. The problem seems to be with the output of anchor_labeler.batch_label_anchors. I am using a batch size of 4, and my model's class_out is a tensor of shape [4, 112, 112, 135] (at each of the 5 levels), while the cls_targets from the anchor labeler is a tensor of shape [4, 128, 128, 135].

I was wondering if there is a problem with the batch_label_anchors method?

My code is as follows:

I get the following error in binary_cross_entropy_with_logits:

Target size (torch.Size([4, 128, 128, 135])) must be the same as input size (torch.Size([4, 112, 112, 135]))

@BardiaKh looks like it's more likely to be an issue with the size of the image you're feeding to the model; it needs to match the model config and be divisible by 128.

Thank you! I appreciate your prompt response. That was indeed the problem.

Thank you again for the effort you put into promoting the PyTorch community.
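For anyone hitting the same mismatch, here is a minimal sketch of the arithmetic behind the fix, assuming an EfficientDet-style feature pyramid with levels P3–P7 (strides 8 through 128). The function name and sizes below are illustrative, not the repo's actual API: the anchor labeler builds cls_targets from the image size in the model config, while class_out follows the size of the image actually fed in, so the two sizes must agree and be divisible by the coarsest stride, 128.

```python
# Illustrative sketch, not efficientdet-pytorch's actual API.
# Levels P3..P7 downsample by strides 8, 16, 32, 64, 128, so a square
# input must be divisible by 128 (the coarsest stride, 2**7) and must
# match the image size the anchor labeler was configured with.

def level_grid(image_size: int, level: int) -> int:
    """Spatial size of the feature map at a given pyramid level."""
    if image_size % 128 != 0:
        raise ValueError(f"image_size {image_size} is not divisible by 128")
    return image_size // (2 ** level)

config_size = 1024  # what the model config / anchor labeler assumes
print(level_grid(config_size, 3))  # 128 -> cls_targets [4, 128, 128, 135]

# Feeding a differently sized image (e.g. 896) gives 896 // 8 = 112 at P3,
# producing class_out [4, 112, 112, 135] and the BCE size-mismatch error,
# even though 896 itself is divisible by 128.
print(level_grid(896, 3))  # 112
```

In short, passing the divisibility check is necessary but not sufficient: the input size must also equal the size the anchor targets were generated for.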