adapt loss to full mask #1426
Conversation
Codecov Report: patch coverage has no change; project coverage changes as follows.

```
@@            Coverage Diff             @@
##           dev-1.x    #1426       +/-   ##
============================================
+ Coverage     0.02%   86.77%    +86.75%
============================================
  Files          121      170        +49
  Lines         8217    14049      +5832
  Branches      1368     2237       +869
============================================
+ Hits             2    12191     +12189
+ Misses        8215     1471      -6744
- Partials         0      387       +387
```

Flags with carried forward coverage won't be shown. 171 files have indirect coverage changes. View the full report in Codecov.
```diff
@@ -23,7 +23,8 @@ def wrapped(inputs, data_samples, **kwargs):
         task_data_samples.append(data_sample.get(task_name))

     if len(task_data_samples) == 0:
-        return {'loss': torch.tensor(0.), 'mask_size': torch.tensor(0.)}
+        loss = (inputs[0] * 0).sum()
```
@marouaneamz I think you should explain this with a comment here.
Closed since #1530 was merged.
In pull request #1229, for the extreme case where we have a full mask, we set the loss to 0, which causes problems when computing the gradient of the loss: a freshly created `torch.tensor(0.)` is not connected to the autograd graph. Here I use the input torch tensor of the image to create a loss equal to 0 that still carries gradient information.
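The difference can be sketched in a few lines of PyTorch (the tensor names below are illustrative, not taken from the PR):

```python
import torch

# A stand-in for the model inputs; in the PR this is inputs[0].
inputs = (torch.randn(2, 3, requires_grad=True),)

# Naive zero loss: a fresh constant tensor has no grad_fn and is
# detached from the graph, so calling backward() on it would raise
# "element 0 of tensors does not require grad".
bad_loss = torch.tensor(0.)

# Graph-connected zero loss (the PR's approach): multiplying an input
# by 0 keeps the autograd graph, so backward() succeeds and simply
# produces all-zero gradients for the skipped task.
good_loss = (inputs[0] * 0).sum()
good_loss.backward()
```

After `good_loss.backward()`, `inputs[0].grad` is a tensor of zeros, so the full-mask task contributes nothing to the update while still letting the combined multi-task loss backpropagate.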