DISABLED test_allocation_id_uniqueness (__main__.TestTorchTidyProfiler) #125021

Open
pytorch-bot bot opened this issue Apr 26, 2024 · 2 comments
Labels
module: dynamo
module: flaky-tests - Problem is a flaky test in CI
oncall: profiler - profiler-related issues (cpu, gpu, kineto)
oncall: pt2
skipped - Denotes a (flaky) test currently skipped in CI
triaged - This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

pytorch-bot bot commented Apr 26, 2024

Platforms: dynamo

This test was disabled because it is failing in CI. See recent examples and the most recent trunk workflow logs.

Over the past 3 hours, it has been determined to be flaky in 9 workflow(s), with 9 failures and 9 successes.

Debugging instructions (after clicking on the recent samples link):
DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers, so CI will be green, but it will be harder to parse the logs.
To find relevant log snippets:

  1. Click on the workflow logs linked above
  2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
  3. Grep for test_allocation_id_uniqueness (a minimal Python sketch of this step follows the list)
  4. There should be several runs (flaky tests are rerun in CI) from which you can study the logs.
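
A minimal sketch of step 3, assuming the raw job log has been downloaded locally as raw-log.txt (a hypothetical filename); this mirrors grep -n test_allocation_id_uniqueness:

    # Scan a locally saved copy of the job log for the test name and print matching lines.
    # "raw-log.txt" is a hypothetical filename; use whatever name you saved the log under.
    from pathlib import Path

    needle = "test_allocation_id_uniqueness"
    for lineno, line in enumerate(Path("raw-log.txt").read_text(errors="replace").splitlines(), 1):
        if needle in line:
            print(f"{lineno}: {line}")
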
Sample error message
Traceback (most recent call last):
  File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/testing/_internal/common_utils.py", line 2831, in wrapper
    raise RuntimeError(f"Unexpected success, please remove `test/dynamo_expected_failures/{test_name}`")
RuntimeError: Unexpected success, please remove `test/dynamo_expected_failures/TestTorchTidyProfiler.test_allocation_id_uniqueness`

Test file path: profiler/test_torch_tidy.py
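
For context on the sample error above: an "Unexpected success" means the test passed under dynamo even though it is listed as an expected failure. PyTorch tracks dynamo expected failures as empty marker files under test/dynamo_expected_failures/, named <TestClass>.<test_name>. Below is a minimal sketch of the cleanup the message asks for, assuming it is run from the repository root; in an actual PR you would git rm the file and commit the change.

    from pathlib import Path

    # Marker file named in the error message above; deleting it tells the test
    # harness to expect this test to pass under dynamo.
    marker = Path("test/dynamo_expected_failures/TestTorchTidyProfiler.test_allocation_id_uniqueness")
    if marker.exists():
        marker.unlink()  # in a real change: `git rm` this file and commit it
        print(f"Removed {marker}")
    else:
        print(f"{marker} not found; nothing to remove")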

cc @clee2000 @robieta @chaekit @aaronenyeshi @guotuofeng @guyang3532 @dzhulgakov @davidberard98 @briancoutinho @sraikund16 @sanrise @ezyang @msaroufim @bdhirsh @anijain2305 @chauhang @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng

pytorch-bot bot added the module: flaky-tests, oncall: profiler, oncall: pt2, and skipped labels on Apr 26, 2024

pytorch-bot bot commented Apr 26, 2024

Hello there! From the DISABLED prefix in this issue title, it looks like you are attempting to disable a test in PyTorch CI. The information I have parsed is below:
  • Test name: test_allocation_id_uniqueness (__main__.TestTorchTidyProfiler)
  • Platforms for which to skip the test: dynamo
  • Disabled by pytorch-bot[bot]

Within ~15 minutes, test_allocation_id_uniqueness (__main__.TestTorchTidyProfiler) will be disabled in PyTorch CI for these platforms: dynamo. Please verify that your test name looks correct, e.g., test_cuda_assert_async (__main__.TestCuda).

To modify the platforms list, please include a line in the issue body, like below. The default action will disable the test for all platforms if no platforms list is specified.

Platforms: case-insensitive, list, of, platforms

We currently support the following platforms: asan, dynamo, inductor, linux, mac, macos, rocm, slow, win, windows.

jbschlosser (Contributor) commented

The "recent examples" link didn't show me the error that occurs when it fails, so here it is:

log
2024-01-12T14:29:08.6612009Z ==================================== RERUNS ====================================
2024-01-12T14:29:08.6612340Z _____________ TestTorchTidyProfiler.test_allocation_id_uniqueness ______________
2024-01-12T14:29:08.6612469Z Traceback (most recent call last):
2024-01-12T14:29:08.6612804Z   File "profiler/test_profiler.py", line 2302, in test_allocation_id_uniqueness
2024-01-12T14:29:08.6612905Z     gc.collect()
2024-01-12T14:29:08.6613537Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/eval_frame.py", line 580, in catch_errors
2024-01-12T14:29:08.6613766Z     return callback(frame, cache_entry, hooks, frame_state)
2024-01-12T14:29:08.6614428Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 741, in _convert_frame
2024-01-12T14:29:08.6614673Z     result = inner_convert(frame, cache_entry, hooks, frame_state)
2024-01-12T14:29:08.6615390Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 384, in _convert_frame_assert
2024-01-12T14:29:08.6615538Z     return _compile(
2024-01-12T14:29:08.6616156Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 670, in _compile
2024-01-12T14:29:08.6616389Z     raise InternalTorchDynamoError(str(e)).with_traceback(
2024-01-12T14:29:08.6617002Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 643, in _compile
2024-01-12T14:29:08.6617272Z     guarded_code = compile_inner(code, one_graph, hooks, transform)
2024-01-12T14:29:08.6617864Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/utils.py", line 247, in time_wrapper
2024-01-12T14:29:08.6617979Z     r = func(*args, **kwargs)
2024-01-12T14:29:08.6618636Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 524, in compile_inner
2024-01-12T14:29:08.6618819Z     out_code = transform_code_object(code, transform)
2024-01-12T14:29:08.6619595Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/bytecode_transformation.py", line 1033, in transform_code_object
2024-01-12T14:29:08.6619806Z     transformations(instructions, code_options)
2024-01-12T14:29:08.6620396Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 151, in _fn
2024-01-12T14:29:08.6620524Z     return fn(*args, **kwargs)
2024-01-12T14:29:08.6621175Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/convert_frame.py", line 489, in transform
2024-01-12T14:29:08.6621307Z     tracer.run()
2024-01-12T14:29:08.6621928Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/symbolic_convert.py", line 2098, in run
2024-01-12T14:29:08.6622028Z     super().run()
2024-01-12T14:29:08.6622634Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/symbolic_convert.py", line 780, in run
2024-01-12T14:29:08.6622752Z     and self.step()
2024-01-12T14:29:08.6623366Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/symbolic_convert.py", line 743, in step
2024-01-12T14:29:08.6623505Z     getattr(self, inst.opname)(inst)
2024-01-12T14:29:08.6624154Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/symbolic_convert.py", line 1282, in LOAD_ATTR
2024-01-12T14:29:08.6624340Z     result = BuiltinVariable(getattr).call_function(
2024-01-12T14:29:08.6625028Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/variables/builtin.py", line 650, in call_function
2024-01-12T14:29:08.6625167Z     result = handler(tx, *args, **kwargs)
2024-01-12T14:29:08.6625840Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/variables/builtin.py", line 1238, in call_getattr
2024-01-12T14:29:08.6625978Z     return obj.var_getattr(tx, name)
2024-01-12T14:29:08.6626616Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/variables/base.py", line 257, in var_getattr
2024-01-12T14:29:08.6626755Z     value = self.const_getattr(tx, name)
2024-01-12T14:29:08.6627592Z   File "/opt/conda/envs/py_3.8/lib/python3.8/site-packages/torch/_dynamo/variables/constant.py", line 128, in const_getattr
2024-01-12T14:29:08.6627732Z     member = getattr(self.value, name)
2024-01-12T14:29:08.6628235Z torch._dynamo.exc.InternalTorchDynamoError: 'NoneType' object has no attribute 'profiler'
2024-01-12T14:29:08.6628242Z 
2024-01-12T14:29:08.6628345Z from user code:
2024-01-12T14:29:08.6628759Z    File "profiler/test_profiler.py", line 2304, in resume_in_test_allocation_id_uniqueness_at_2302
2024-01-12T14:29:08.6629017Z     roots = p.profiler.kineto_results.experimental_event_tree()
2024-01-12T14:29:08.6629022Z 
2024-01-12T14:29:08.6629311Z Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
2024-01-12T14:29:08.6629317Z 
2024-01-12T14:29:08.6629322Z 
2024-01-12T14:29:08.6629610Z You can suppress this exception and fall back to eager by setting:
2024-01-12T14:29:08.6629764Z     import torch._dynamo
2024-01-12T14:29:08.6629936Z     torch._dynamo.config.suppress_errors = True
2024-01-12T14:29:08.6629942Z 
2024-01-12T14:29:08.6629947Z 
2024-01-12T14:29:08.6630210Z To execute this test, run the following from the base repo dir:
2024-01-12T14:29:08.6630637Z     PYTORCH_TEST_WITH_DYNAMO=1 python test_profiler.py -k test_allocation_id_uniqueness
2024-01-12T14:29:08.6630643Z 
2024-01-12T14:29:08.6630961Z This message can be suppressed by setting PYTORCH_PRINT_REPRO_ON_FAILURE=0
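
From the resumed-frame name resume_in_test_allocation_id_uniqueness_at_2302 in the log, Dynamo graph-breaks at the gc.collect() call on line 2302 and then traces the resumed frame, where the value being loaded resolves to None, hence the 'NoneType' object has no attribute 'profiler' error on line 2304. Below is a rough reconstruction of the failing pattern, pieced together from the traceback rather than the actual test source; the tensor work and profiler arguments are assumptions, and only the last line is taken verbatim from the log.

    import gc
    import torch
    from torch.profiler import profile

    # Hedged reconstruction of the code around profiler/test_profiler.py:2302-2304.
    # Run eagerly this works; the failure above appears when the whole test is run
    # with PYTORCH_TEST_WITH_DYNAMO=1, which compiles the test frames.
    with profile(profile_memory=True) as p:
        x = torch.ones(4, 4)  # record at least one allocation event
        del x
    gc.collect()  # line 2302: Dynamo graph-breaks the test frame here

    # Line 2304 runs in the resumed frame; under Dynamo the traced value of `p` is a
    # constant None, so this attribute access raises InternalTorchDynamoError.
    roots = p.profiler.kineto_results.experimental_event_tree()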

jbschlosser added the triaged label on Apr 29, 2024
zou3519 closed this as completed on Jun 14, 2024
zou3519 reopened this on Jun 14, 2024