Create a mechanism to trigger tests that only warn once #47624
If I understand the code correctly, it seems … But that is only one part of the story. Overriding the behaviour of …
I was hoping we could do something like set a global context, `setAlwaysWarn(true/false)`, that TORCH_WARN_ONCE would query; if the global flag were true, it would then warn like TORCH_WARN does.
Is this the direction you were thinking?
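A minimal Python sketch of the idea above — a global "always warn" flag that a warn-once helper consults before deciding whether to re-emit. The names (`set_always_warn`, `warn_once`) are illustrative stand-ins, not the actual PyTorch API:

```python
import warnings

# Hypothetical sketch: a module-level flag that the warn-once helper
# queries. When the flag is set, every call warns; otherwise each
# distinct message is emitted only on its first occurrence.
_always_warn = False
_seen = set()

def set_always_warn(value):
    global _always_warn
    _always_warn = bool(value)

def warn_once(msg):
    # Warn every time when the debug flag is set; otherwise only once
    # per unique message (mirroring TORCH_WARN_ONCE's dedup).
    if _always_warn or msg not in _seen:
        _seen.add(msg)
        warnings.warn(msg)
```

With the flag enabled for the duration of a test, a warn-once call site fires on every invocation and can be asserted on reliably.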
Then …
Is there a warning context manager in the testing framework, or should we use the …?
That's close to what I was thinking of. I was thinking we'd keep TORCH_WARN_ONCE, and there would be a flag, like …, controlled by a PyTorch global context (like the deterministic flag). We currently use the Python warning filter in our test suite and that seems to be OK; I don't think we'd need to use numpy.testing.suppress_warnings. Basically we want this flag to be just for debugging: we'll set it to true during the test suite so we can verify that these warnings are thrown. Does that sound good?
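For Python-level warnings, the standard warning filter mentioned above is already enough to force repeats in a test; the sketch below uses only the standard `warnings` module (`deprecated_op` is a made-up example function). The limitation is that C++-side TORCH_WARN_ONCE deduplicates before Python ever sees the warning, which is why a separate toggle is still needed:

```python
import warnings

def deprecated_op():
    # Stand-in for an op that emits a Python-level deprecation warning.
    warnings.warn("deprecated_op is deprecated", UserWarning)

with warnings.catch_warnings(record=True) as caught:
    # "always" disables Python's own once-per-location dedup, so every
    # call is recorded and can be asserted on.
    warnings.simplefilter("always")
    deprecated_op()
    deprecated_op()
```

Both calls land in `caught`, so a test can match each message with a regex.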
Now that the …
Summary: Toward fixing #47624

~Step 1: add `TORCH_WARN_MAYBE` which can either warn once or every time in C++, and add a C++ function to toggle the value. Step 2 will be to expose this to Python for tests. Should I continue in this PR, or should we take a different approach: add the Python-level exposure without changing any C++ code, and then over a series of PRs change each call site to use the new macro and change the tests to make sure it is being checked?~

Step 1: add a Python and C++ toggle to convert TORCH_WARN_ONCE into TORCH_WARN so the warnings can be caught in tests
Step 2: add a Python-level decorator to use this toggle in tests
Step 3 (in future PRs): use the decorator to catch the warnings instead of `maybeWarnsRegex`

Pull Request resolved: #48560
Reviewed By: ngimel
Differential Revision: D26171175
Pulled By: mruberry
fbshipit-source-id: d83c18f131d282474a24c50f70a6eee82687158f
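Step 2 of the plan above — a Python-level decorator that flips the toggle for the duration of a test — could be sketched as follows. `set_warn_always`/`get_warn_always` are hypothetical stand-ins for the eventual C++-backed toggle, not the shipped API:

```python
import functools

# Hypothetical stand-ins for the C++-backed toggle.
_warn_always = False

def set_warn_always(value):
    global _warn_always
    _warn_always = bool(value)

def get_warn_always():
    return _warn_always

def warn_always_in_test(fn):
    """Decorator: force warn-once sites to warn every time while fn runs."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        prev = get_warn_always()
        set_warn_always(True)
        try:
            return fn(*args, **kwargs)
        finally:
            # Restore the previous state even if the test fails.
            set_warn_always(prev)
    return wrapper
```

The try/finally restore matters: without it, one decorated test would leave the debug flag set for every test that runs afterward.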
I think this issue is closed. @mattip, would you file a separate issue to audit uses of maybeWarnsRegex in the test suite?
See issue #52025
Today in PyTorch we have TORCH_WARN and TORCH_WARN_ONCE. TORCH_WARN_ONCE, as its name suggests, only throws a warning one time. This makes it tricky to test for. Our test suite, for example, has maybeWarnsRegex, but this just verifies that no other warnings are thrown.
This feature request is for a mechanism to consistently trigger warnings that use TORCH_WARN_ONCE. One mechanism for this would be to add a new context that, if set, causes these warnings to always warn. This would let us reliably test that these warnings are thrown correctly.
cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @mruberry @VitalyFedyunin @walterddr