improve UX for v2 Compose #7758

Merged
merged 5 commits into pytorch:main on Jul 25, 2023

Conversation

@pmeier pmeier (Collaborator) commented Jul 25, 2023

The way Compose is currently implemented, the input is always passed "packed" to the individual transforms. Meaning, if one calls Compose([...])(image, label), i.e. passes the inputs individually ("unpacked"), the transforms inside the Compose still receive (image, label), i.e. the individual inputs packed into a tuple.

For our built-in transforms this makes no difference, since they operate on arbitrary input structures anyway. However, it becomes relevant as soon as a user wants to add a custom transform to the same pipeline. Assuming the user doesn't want to bother with our whole protocol and knows the sample structure exactly, they likely want to write something like

class MyTransform(nn.Module):
    def forward(self, image, label):
        ...

However, dropping this into a Compose would not work when passing image and label individually, due to the packing behavior described above. Instead the user would have to write

class MyTransform(nn.Module):
    def forward(self, sample):
        image, label = sample
        ...

This is somewhat awkward, since the input of the individual transform no longer conforms to the input of the Compose.

This PR fixes this behavior: if the inputs to the Compose are unpacked, they also have to be unpacked in the signature of the individual transform. The same is true the other way around: if the input to the Compose is packed, the individual transform also needs to take a packed input.
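A minimal sketch of the forwarding logic described here (illustration only, not the actual torchvision implementation; needs_unpacking is an assumed name):

import torch.nn as nn


class Compose(nn.Module):
    # Sketch of the behavior described above; not the merged torchvision code.
    def __init__(self, transforms):
        super().__init__()
        self.transforms = list(transforms)

    def forward(self, *inputs):
        # More than one positional argument means the caller passed the sample
        # "unpacked"; keep forwarding it unpacked to every inner transform.
        needs_unpacking = len(inputs) > 1
        sample = inputs if needs_unpacking else inputs[0]
        for transform in self.transforms:
            sample = transform(*sample) if needs_unpacking else transform(sample)
        return sample

With this kind of forwarding, the MyTransform from the first snippet can be dropped into a pipeline and called as Compose([...])(image, label), while a pipeline that receives a single packed sample keeps passing that sample through as one object.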

cc @vfdev-5

@pytorch-bot pytorch-bot bot commented Jul 25, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/vision/7758

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 8 New Failures

As of commit f8c326e:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

test/test_transforms_v2_refactored.py (Outdated)
return image, label

@pytest.mark.parametrize(
"transform_clss",
Member

Why not call this transform_class?

Collaborator Author

It would have to be transform_classes, since it is a list of classes. And since we use cls for the singular, I usually just append an s to it. I'll leave it up to you.

test/test_transforms_v2_refactored.py (Outdated)
Comment on lines 51 to 52
if not self.transforms:
    return inputs if needs_unpacking else inputs[0]
Member

Should we just error in init? I don't know if it's super valuable to special-case this?

Collaborator Author

Depends. The old Compose works as a no-op if you don't put any transforms in. However, this doesn't sound like a valid use case, so I'm OK with putting this into the constructor.

Collaborator Author

I went with the error. LMK if you want the no-op behavior back.
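For context, a minimal sketch of the constructor-time check that was settled on (assumed names and error messages; not necessarily the exact code that was merged):

from collections.abc import Sequence

import torch.nn as nn


class Compose(nn.Module):
    def __init__(self, transforms):
        super().__init__()
        if not isinstance(transforms, Sequence):
            raise TypeError("transforms should be a sequence of callables")
        if not transforms:
            # Reject empty pipelines up front instead of special-casing
            # them as a no-op inside forward().
            raise ValueError("Pass at least one transform")
        self.transforms = transforms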

test/test_transforms_v2_refactored.py (Outdated)
test/test_transforms_v2_refactored.py (Outdated)
@pmeier pmeier requested a review from NicolasHug July 25, 2023 14:05
Co-authored-by: Nicolas Hug <contact@nicolas-hug.com>
@NicolasHug NicolasHug (Member) left a comment

LGTM if green, thank you!

@pmeier pmeier merged commit cc0f9d0 into pytorch:main Jul 25, 2023
53 of 61 checks passed
@pmeier pmeier deleted the compose branch July 25, 2023 15:03
facebook-github-bot pushed a commit that referenced this pull request Aug 25, 2023
Summary: Co-authored-by: Nicolas Hug <contact@nicolas-hug.com>

Reviewed By: matteobettini

Differential Revision: D48642249

fbshipit-source-id: 48053ff1f6a19d62264500358a4df125751b71e6