{ai}[foss/2022a] PyTorch v2.0.1 #19066
Conversation
…2.0.1_add-missing-vsx-vector-shift-functions.patch
- PyTorch-2.0.1_avoid-test_quantization-failures.patch
- PyTorch-2.0.1_disable-test-sharding.patch
- PyTorch-2.0.1_fix-numpy-compat.patch
- PyTorch-2.0.1_fix-shift-ops.patch
- PyTorch-2.0.1_fix-skip-decorators.patch
- PyTorch-2.0.1_fix-test_memory_profiler.patch
- PyTorch-2.0.1_fix-test-ops-conf.patch
- PyTorch-2.0.1_fix-torch.compile-on-ppc.patch
- PyTorch-2.0.1_fix-ub-in-inductor-codegen.patch
- PyTorch-2.0.1_fix-vsx-loadu.patch
- PyTorch-2.0.1_no-cuda-stubs-rpath.patch
- PyTorch-2.0.1_remove-test-requiring-online-access.patch
- PyTorch-2.0.1_skip-diff-test-on-ppc.patch
- PyTorch-2.0.1_skip-failing-gradtest.patch
- PyTorch-2.0.1_skip-test_shuffle_reproducibility.patch
- PyTorch-2.0.1_skip-tests-skipped-in-subprocess.patch
Test report by @Flamefire
@boegelbot please test @ jsc-zen2
@boegel: Request for testing this PR well received on jsczen2l1.int.jsc-zen2.easybuild-test.cluster. PR test command '
Test results coming soon (I hope)...
- notification for comment with ID 1778707471 processed
Message to humans: this is just bookkeeping information for me,
Test report by @boegelbot
Test report by @SebastianAchilles
Test report by @branfosj
Test report by @boegel
@branfosj Can you dig into the log file and extract more details on the failing test?

My vote goes to ignoring this test for now, so we can merge this PR and follow up in another PR to get that (quirky?) test fixed, since we've seen success on a range of different systems here (incl. POWER!).
With the traceback being short:
I added a patch to disable that check (and similar ones) by setting a flag (which could also be set via an environment variable) that catches unexpected successes in this test. This should make the test succeed without any potential influence on other tests. See https://github.com/pytorch/pytorch/blob/v2.0.1/test/inductor/test_torchinductor_opinfo.py#L605-L608

I think with that patch added we can consider the report a success and merge this without another test run (I verified with
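For readers unfamiliar with the pattern: the patch essentially downgrades "a test expected to fail actually passed" from a hard error to a recorded pass, gated behind a flag. A minimal sketch of that idea, assuming illustrative names (`ALLOW_UNEXPECTED_SUCCESS`, `check_expected_failure` are not the actual PyTorch identifiers):

```python
import os

# Hypothetical flag: a patch can set this directly, or it can be seeded
# from an environment variable, as suggested in the comment above.
ALLOW_UNEXPECTED_SUCCESS = os.environ.get("ALLOW_UNEXPECTED_SUCCESS", "0") == "1"

def check_expected_failure(test_name, expected_failures, passed):
    """Classify a finished test as 'pass', 'fail', or 'unexpected success'.

    expected_failures is the set of test names the suite expects to fail.
    """
    if test_name in expected_failures and passed:
        # Normally an unexpected success is an error (it means the
        # expected-failure list is stale); with the flag set we treat
        # it as a plain pass so the overall run can succeed.
        if ALLOW_UNEXPECTED_SUCCESS:
            return "pass"
        return "unexpected success"
    return "pass" if passed else "fail"
```

Only the handling of the unexpected-success case changes; genuine failures still fail, so other tests are unaffected.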
Test report by @branfosj
Going in, thanks @Flamefire!
Test report by @Flamefire
Test report by @VRehnberg
Test report by @VRehnberg
(created using `eb --new-pr`)