
allow sequence fill for v2 AA scripted #7919

Merged
merged 1 commit into main from aa-fill on Aug 31, 2023
Conversation

@pmeier (Collaborator) commented Aug 31, 2023

#7839 fixed the v2 AA transform family not being JIT scriptable, which was caused by us unconditionally converting the user-supplied fill into a dictionary. However, the fix made the check stricter than it needed to be:

We currently only allow scalar values for fill:

if not (params["fill"] is None or isinstance(params["fill"], (int, float))):
    raise ValueError(f"{type(self).__name__}() can only be scripted for a scalar `fill`, but got {self.fill}.")
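
For illustration, a hypothetical minimal repro (not part of the PR) of what this check currently rejects, assuming the public torchvision.transforms.v2.AutoAugment API and that the check fires when the transform is scripted:

```python
import torch
import torchvision.transforms.v2 as T

# Hypothetical repro: a sequence fill is not an int/float, so the scalar-only
# check above raises as soon as we try to script the v2 transform.
transform = T.AutoAugment(fill=[0.0, 0.0, 0.0])
scripted = torch.jit.script(transform)  # ValueError: ... can only be scripted for a scalar `fill` ...
```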

However, the v1 equivalent works with sequences as well, e.g.

@pytest.mark.parametrize("fill", [None, 85, (10, -10, 10), 0.7, [0.0, 0.0, 0.0], [1], 1])
def test_autoaugment(device, policy, fill):
    tensor = torch.randint(0, 256, size=(3, 44, 56), dtype=torch.uint8, device=device)
    batch_tensors = torch.randint(0, 256, size=(4, 3, 44, 56), dtype=torch.uint8, device=device)
    transform = T.AutoAugment(policy=policy, fill=fill)
    s_transform = torch.jit.script(transform)

Thus, we only need to exclude the case of a user-supplied fill dictionary.
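
A minimal sketch of what the relaxed check could look like, assuming it lives on the v2 AA base class (something like `_AutoAugmentBase`) in the `_extract_params_for_v1_transform` hook; the exact code and error message are illustrative, not the merged diff:

```python
# Sketch: a dictionary fill cannot be passed through to the scriptable v1
# transform, so reject only that case and let scalars and sequences through.
def _extract_params_for_v1_transform(self):
    params = super()._extract_params_for_v1_transform()

    if isinstance(params["fill"], dict):
        raise ValueError(
            f"{type(self).__name__}() cannot be scripted for a dictionary `fill`, but got {self.fill}."
        )

    return params
```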

cc @vfdev-5

@pytorch-bot bot commented Aug 31, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/vision/7919

Note: Links to docs will display an error until the docs builds have been completed.

❌ 3 New Failures, 4 Unrelated Failures

As of commit ac6bd0c with merge base 96950a5:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@NicolasHug (Member) left a comment

Thanks a lot @pmeier, LGTM when green

@pmeier pmeier merged commit b828671 into pytorch:main Aug 31, 2023
56 of 63 checks passed
@pmeier pmeier deleted the aa-fill branch August 31, 2023 11:08
pmeier added a commit to pmeier/vision that referenced this pull request Aug 31, 2023
facebook-github-bot pushed a commit that referenced this pull request Sep 6, 2023
Reviewed By: matteobettini

Differential Revision: D48900369

fbshipit-source-id: 42c6f62196d7d53bd4d9e8b2d75494de8ba966ce