Returning outputs only when asked for for MaskFormer. #15936
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.
Thanks for fixing this!
```diff
@@ -2421,6 +2428,7 @@ def forward(
         mask_labels: Optional[Tensor] = None,
         class_labels: Optional[Tensor] = None,
         pixel_mask: Optional[Tensor] = None,
+        output_auxiliary_logits: Optional[bool] = None,
```
This should probably go with a config argument (like output_hidden_states etc.) that would give a default. Otherwise defaulting to None makes no sense.
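For context, a minimal sketch of the pattern being suggested here, mirroring how output_hidden_states is resolved against its config default elsewhere in the library; the config attribute name is assumed to match the argument, not taken from this diff:

```python
# Inside the model's forward (sketch): resolve the per-call flag
# against a config-level default, as is done for
# output_hidden_states / output_attentions in other models.
output_auxiliary_logits = (
    output_auxiliary_logits
    if output_auxiliary_logits is not None
    else self.config.output_auxiliary_logits
)
```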
Ok, I made the change, but will wait for @FrancescoSaverioZuppichini's advice to make sure this is sound.
Thanks! Just a little bit of context: the auxiliary_logits are consumed by the loss if use_auxiliary_loss is set to True. So we could either remove them from the output or add a flag in the config to return them.
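For illustration, a hedged sketch of how the two flags discussed here would combine on the config; use_auxiliary_loss already exists on MaskFormerConfig, while output_auxiliary_logits is the flag proposed in this PR:

```python
from transformers import MaskFormerConfig

# Sketch, assuming the new flag lands on the config as discussed:
config = MaskFormerConfig(
    use_auxiliary_loss=True,       # auxiliary logits feed the loss
    output_auxiliary_logits=True,  # ...and are also returned to the caller
)
```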
So if someone uses both output_auxiliary_logits=True and use_auxiliary_loss=True, then it wouldn't work currently, is that it? (If so, we need to change the consuming behavior into a copy, I think.)
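A hypothetical sketch of the concern raised above (names illustrative, not the actual MaskFormer code): if the loss were to pop the auxiliary logits out of the outputs, they would be gone by the time the model returns them; reading without mutating avoids the conflict:

```python
# Read the auxiliary logits instead of popping them, so computing
# the loss does not consume the value the caller asked for.
auxiliary_logits = outputs.get("auxiliary_logits")  # read, don't pop

loss = None
if mask_labels is not None and class_labels is not None:
    # the loss can still see the auxiliary logits here
    loss = compute_loss(outputs, mask_labels, class_labels)  # hypothetical helper

if not output_auxiliary_logits:
    auxiliary_logits = None  # caller did not ask for them
```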
Friendly ping @FrancescoSaverioZuppichini
It will work :)
Thanks! LGTM
What does this PR do?
Changes the return output from () to None, which seems more aligned with the rest of the library.

Also, auxiliary_logits seem optional and don't seem to be used by the feature extractor, so this PR makes them optional too.
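A short usage sketch of what the change means for callers (model and inputs assumed to be a MaskFormer model and its processed inputs; the check is illustrative):

```python
outputs = model(**inputs)

# After this PR, outputs that were not requested come back as None
# rather than an empty tuple, matching the rest of the library:
if outputs.auxiliary_logits is not None:
    # auxiliary logits were requested and returned
    ...
```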
Note: I couldn't run the modeling tests; there seem to be no fast tests, and the slow tests are failing for reasons seemingly unrelated to this PR.
Fixes # (issue)
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.