
Conversation

@kurtamohler
Collaborator

Fixes #84874

@kurtamohler kurtamohler added the module: tests Issues related to tests (not the torch.testing module) label Sep 12, 2022
@pytorch-bot

pytorch-bot bot commented Sep 12, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/84875

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit fc67425:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

Collaborator

@mruberry mruberry left a comment


It's too bad we don't have a great way to tell if tests actually ran or not. What was happening here?

@kurtamohler
Collaborator Author

@mruberry, I think it should be possible to write tests for the decorators to make sure they do what we expect. I can look more into that.

To explain what was wrong: skipIfTorchDynamo(expectedFailure('meta')(fn)) passes the already-decorated function directly to skipIfTorchDynamo, which treats it as the factory's argument and returns a decorator function that, if called, would return a wrapped function. So the unit test ended up only calling that decorator function and discarding the wrapped function it returned; the test body never executed. We need the unit test to execute the wrapped function, not the decorator function, so the correct form is skipIfTorchDynamo()(expectedFailure('meta')(fn)). The skipIfTorchDynamo() call returns a decorator function, applying it to expectedFailure('meta')(fn) produces the wrapped function, and finally the unit test executes that wrapped function, which contains the expected unit test code.

@kurtamohler
Collaborator Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge failed

Reason: Approval needed from one of the following (Rule 'superuser'):
xta0, adamomainz, Nitrokitty, RdoubleA, suo, ...

If you believe this is an error, you can use the old behavior with @pytorchbot merge -g (optionally with the ciflow/trunk to get land checks) or use @pytorchbot merge -f "some reason here". For more information, see the bot wiki.

Please reach out to the PyTorch DevX Team with feedback or questions!

Details for Dev Infra team Raised by workflow job

@kurtamohler
Collaborator Author

@pytorchbot merge -g

@pytorchmergebot
Collaborator

Merge failed

Reason: Approval needed from one of the following (Rule 'superuser'):
houseroad, shreyanb98, bilalsal, chinannyang, d4l3k, ...

Details for Dev Infra team Raised by workflow job

@kurtamohler
Collaborator Author

@pytorchbot merge

@mruberry
Collaborator

Sorry about the merge issues, @kurtamohler, I think we fixed my reviewer perms now and this should work OK

@pytorchmergebot
Collaborator

@pytorchbot successfully started a merge job. Check the current status here and land check progress here.
The merge job was triggered with the land checks (-l) flag. If you did not specify this flag yourself, you are likely enrolled in the land checks rollout. This means that your change will be merged once all checks on your PR and the land checks have passed (ETA 4 Hours). If you need to coordinate lands between different changes and cannot risk a land race, please add the ciflow/trunk label to your PR and wait for signal to complete, and then land your changes in proper order. Having trunk, pull, and Lint pre-run on a PR will bypass land checks and the ETA should be immediate. If this is not the intended behavior, feel free to use some of the other merge options in the wiki.
Please reach out to the PyTorch DevX Team with feedback or questions!

@kurtamohler
Collaborator Author

No problem. I just happened to see you got re-added as a superuser, so I figured I'd try merging again.

@github-actions
Contributor

Hey @kurtamohler.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

mehtanirav pushed a commit that referenced this pull request Oct 4, 2022

Labels

cla signed Merged module: tests Issues related to tests (not the torch.testing module) open source

Development

Successfully merging this pull request may close these issues.

Using expectedFailureMeta decorator prevents tests from running

5 participants