
port tests for container transforms #8012

Merged
merged 3 commits into pytorch:main from port/container on Oct 3, 2023

Conversation

@pmeier (Collaborator) commented Oct 2, 2023

@pytorch-bot bot commented Oct 2, 2023

@@ -396,6 +396,8 @@ def check_transform(transform, input, check_v1_compatibility=True, check_sample_
     if check_v1_compatibility:
         _check_transform_v1_compatibility(transform, input, **_to_tolerances(check_v1_compatibility))

+    return output
@pmeier (Collaborator, Author):
We already compute the output in check_transform. By returning it, we don't need to recompute in case the test performs additional checks.
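A minimal sketch of that pattern (the helper and transform names here are hypothetical illustrations, not the actual torchvision test-suite code): the helper computes the output once while running its checks, then hands it back so a test with additional assertions can reuse it instead of re-applying the transform.

```python
# Hypothetical sketch; check_transform_sketch and Double are illustration
# names, not the real torchvision test helpers.
def check_transform_sketch(transform, input):
    output = transform(input)
    # ... common checks would run here, e.g. that the output keeps the input type ...
    assert type(output) is type(input)
    # Returning the output lets callers add checks without recomputing it.
    return output

class Double:
    def __call__(self, x):
        return x * 2

actual = check_transform_sketch(Double(), 3)
assert actual == 6  # extra, test-specific check reusing the returned output
```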

@@ -100,14 +100,15 @@ def _extract_params_for_v1_transform(self) -> Dict[str, Any]:
         return {"transforms": self.transforms, "p": self.p}

     def forward(self, *inputs: Any) -> Any:
-        sample = inputs if len(inputs) > 1 else inputs[0]
+        needs_unpacking = len(inputs) > 1
@pmeier (Collaborator, Author):
This got the same treatment as transforms.Compose in #7758. TL;DR: RandomApply now has the same UX as Compose in terms of passing packed or unpacked inputs.
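Roughly, the pattern both containers now share looks like this (a simplified sketch, not the exact torchvision implementation): each wrapped transform is called with the unpacked inputs, and its outputs are re-packed so the next transform sees the same calling convention.

```python
# Simplified sketch of the shared packed/unpacked input handling; not the
# exact torchvision code.
class ComposeSketch:
    def __init__(self, transforms):
        self.transforms = transforms

    def __call__(self, *inputs):
        needs_unpacking = len(inputs) > 1
        for transform in self.transforms:
            outputs = transform(*inputs)
            # Re-pack so the next transform is called the same way.
            inputs = outputs if needs_unpacking else (outputs,)
        return outputs

def double(x):
    return x * 2

def swap(a, b):
    return b, a

# Single (packed) input: transforms receive and return a bare value.
assert ComposeSketch([double, double])(3) == 12
# Unpacked inputs: each transform receives all of them as arguments.
assert ComposeSketch([swap])(1, 2) == (2, 1)
```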

@@ -173,8 +174,9 @@ def __init__(self, transforms: Sequence[Callable]) -> None:
         self.transforms = transforms

     def forward(self, *inputs: Any) -> Any:
-        sample = inputs if len(inputs) > 1 else inputs[0]
+        needs_unpacking = len(inputs) > 1
@pmeier (Collaborator, Author):
Same as above.

Comment on lines 1909 to 1913:

    # horizontal and vertical flip are commutative. Meaning, although the order in the transform is indeed random,
    # we don't need to care here.
    expected = F.vertical_flip(F.horizontal_flip(input))

    assert_equal(actual, expected)
Member:

Unfortunately it also means that this test doesn't check that the order of the transforms is indeed random. At best it checks that the input transforms are applied. I think we should at least acknowledge that in the comment instead of saying "we don't need to care".

@pmeier (Collaborator, Author):

Fair enough. All ears if you have an idea to check whether the transforms are actually applied in random order.

Member:

A pair of non-commutative transforms, with two hand-chosen seeds, asserting that the results are different. What makes it a bit harder is that the transforms themselves must be non-random. Worst case scenario, we could just define Plus2 and Times2.
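A hedged sketch of that suggestion (Plus2/Times2 and the shuffling helper are illustrations, not part of the actual test suite): addition and multiplication don't commute, so the final value reveals which transform ran first, and seeing both possible values across seeds shows the order really is randomized.

```python
import random

# Hypothetical sketch: plus2, times2, and random_order_apply are
# illustration helpers, not torchvision code.
def plus2(x):
    return x + 2

def times2(x):
    return x * 2

def random_order_apply(transforms, x, seed):
    # Stand-in for a container that applies its transforms in random order.
    rng = random.Random(seed)
    order = list(transforms)
    rng.shuffle(order)
    for transform in order:
        x = transform(x)
    return x

# The two fixed orders give different results ...
assert times2(plus2(3)) == 10
assert plus2(times2(3)) == 8
# ... so observing both values across many seeds shows the order varies.
results = {random_order_apply([plus2, times2], 3, seed) for seed in range(100)}
assert results == {8, 10}
```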

@pmeier pmeier merged commit ddfee23 into pytorch:main Oct 3, 2023
47 of 62 checks passed
@pmeier pmeier deleted the port/container branch October 3, 2023 19:56
@github-actions bot commented Oct 3, 2023

Hey @pmeier!

You merged this PR, but no labels were added. The list of valid labels is available at https://github.com/pytorch/vision/blob/main/.github/process_commit.py

    input = make_image()

    actual = check_transform(transform, input)
    # We can't really check whether the transforms are actually applied in random order. However, horizontal and
Member:

I might be too pedantic but I wouldn't say we "can't", we just choose not to.

facebook-github-bot pushed a commit that referenced this pull request Oct 31, 2023
Reviewed By: vmoens

Differential Revision: D50789102

fbshipit-source-id: d95eea91a6574a13c8d53023ab4b2fb65be62ea3