Make RandomApply torchscriptable in V2 #7256
Conversation
def _extract_params_for_v1_transform(self) -> Dict[str, Any]:
    return {"transforms": self.transforms, "p": self.p}
I don't understand why, but the transforms key was missing when using _extract_params_for_v1_transform() from the base class.
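For context, a rough sketch of what the inherited default presumably does (a simplified assumption for illustration, not the actual torchvision code): it builds the v1 kwargs by scanning self.__dict__ for public attributes, which is why anything nn.Module stores elsewhere never shows up.

# Hypothetical sketch of the base-class default, for illustration only.
# It only inspects self.__dict__, so attributes routed elsewhere are skipped.
def _extract_params_for_v1_transform(self) -> Dict[str, Any]:
    return {
        attr: value
        for attr, value in self.__dict__.items()
        if not attr.startswith("_")
    }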
nn.Module overwrites __setattr__. One of the things it does is to not assign other nn.Module's into self.__dict__, but rather into self._modules. Since we only go through self.__dict__, we don't pick up self.transforms, because isinstance(nn.ModuleList(...), nn.Module) holds.
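Here is a minimal, self-contained demonstration of that behaviour (a toy module, not the torchvision transform): a plain float attribute lands in __dict__, while an nn.ModuleList is redirected into _modules by nn.Module.__setattr__, so a __dict__-only scan never sees it.

import torch.nn as nn

class Toy(nn.Module):
    def __init__(self):
        super().__init__()
        self.p = 0.5  # plain value -> stored in self.__dict__
        self.transforms = nn.ModuleList([nn.Identity()])  # nn.Module -> routed into self._modules

toy = Toy()
print("p" in toy.__dict__)            # True
print("transforms" in toy.__dict__)   # False
print("transforms" in toy._modules)   # True

This is why the override above lists transforms and p explicitly instead of relying on the generic __dict__ scan.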
wow, good catch! thanks for the explanation
Thanks Nicolas!
Hey @NicolasHug! You merged this PR, but no labels were added. The list of valid labels is available at https://github.com/pytorch/vision/blob/main/.github/process_commit.py
Reviewed By: vmoens
Differential Revision: D44416608
fbshipit-source-id: 1e8afbc880dacacacbd2f3e543d21cb4b90e5fdf
We missed it before.
cc @vfdev-5 @bjuncek @pmeier