
Discrepancy between anchor_labeler.batch_label_anchors and model output #189

Closed
BardiaKh opened this issue Mar 10, 2021 · 2 comments
BardiaKh commented Mar 10, 2021

Hello,

I am trying to run your code on a medical imaging task, and unfortunately I can't get it to train. The problem seems to be with the output of anchor_labeler.batch_label_anchors. I am using a batch size of 4, and my model's class_out is a tensor of shape [4, 112, 112, 135] (at each of the 5 levels), while the cls_targets from the anchor labeler is a tensor of shape [4, 128, 128, 135].

I was wondering if there is a problem with the batch_label_anchors method?

My code is as follows:

anchors = effdet.anchors.Anchors.from_config(model_config)
anchor_labeler = effdet.anchors.AnchorLabeler(anchors, num_classes, match_threshold=0.5)

model = model.cuda()
class_out, box_out = model(batch['dcms'].cuda())
cls_targets, box_targets, num_positives = anchor_labeler.batch_label_anchors(
    batch['target']['bbox'], batch['target']['cls'])
loss, class_loss, box_loss = loss_fn(class_out, box_out, cls_targets, box_targets, num_positives)

I get the following error in binary_cross_entropy_with_logits:

Target size (torch.Size([4, 128, 128, 135])) must be the same as input size (torch.Size([4, 112, 112, 135]))

rwightman (Owner) commented

@BardiaKh it looks more likely to be an issue with the size of the image you're feeding to the model: it needs to match the model config and be divisible by 128.
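To illustrate the shape mismatch (this is a sketch, not code from the issue): EfficientDet's anchors and targets are built from the image size in the model config, while the network's outputs are shaped by the actual input. Assuming the usual pyramid levels 3–7 with a stride of 2**level at each level, a 1024-px config gives 128×128 targets at level 3, whereas a 896-px input gives 112×112 outputs, which matches the sizes in the error above:

```python
def feature_map_sizes(image_size, min_level=3, max_level=7):
    """Spatial size of each pyramid level for a square input image.

    Assumes the standard EfficientDet strides of 2**level per level;
    the image size must be divisible by the largest stride (2**7 = 128).
    """
    return {level: image_size // (2 ** level)
            for level in range(min_level, max_level + 1)}

# Anchors/targets come from the config's image size (e.g. 1024):
print(feature_map_sizes(1024)[3])  # 128 -> cls_targets is [B, 128, 128, C]

# A 896-px input produces smaller feature maps:
print(feature_map_sizes(896)[3])   # 112 -> class_out is [B, 112, 112, C]
```

So if the two disagree, the loss function sees target and input tensors of different spatial sizes, exactly as in the binary_cross_entropy_with_logits error. Resizing the input to the config's image size resolves it.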

BardiaKh (Author) commented

Thank you! I appreciate your prompt response. That was indeed the problem.
Thank you again for the effort you put into supporting the PyTorch community.
