This repository was archived by the owner on Aug 21, 2025. It is now read-only.
test_tensor_ctor_inside_grad is failing on cuda on master #806
Closed
Labels
actionable (it is clear what should be done for this issue), high priority (these issues are at the top of mind for us)
Description
The skip is buried a little bit (I also couldn't get an xfail working and eventually gave up on that), so I'm filing an issue so we don't lose track of it. This seems weird, but here's the stack trace we're seeing:
Traceback (most recent call last):
File "/home/circleci/project/env/lib/python3.7/site-packages/torch/testing/_internal/common_utils.py", line 1808, in wrapper
method(*args, **kwargs)
File "/home/circleci/project/env/lib/python3.7/site-packages/torch/testing/_internal/common_utils.py", line 1808, in wrapper
method(*args, **kwargs)
File "/home/circleci/project/env/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 390, in instantiated_test
raise rte
File "/home/circleci/project/env/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 377, in instantiated_test
result = test(self, **param_kwargs)
File "test/test_eager_transforms.py", line 821, in test_tensor_ctor_inside_grad
functorch.grad(foo)(x)
File "/home/circleci/project/functorch/_src/eager_transforms.py", line 1192, in wrapper
results = grad_and_value(func, argnums, has_aux=has_aux)(*args, **kwargs)
File "/home/circleci/project/functorch/_src/eager_transforms.py", line 1062, in wrapper
output = func(*args, **kwargs)
File "test/test_eager_transforms.py", line 818, in foo
return x * torch.tensor(2., device=device)
RuntimeError: Cannot access data pointer of Tensor that doesn't have storage
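For reference, a minimal repro sketched from the frames in the traceback above; the exact body of test_tensor_ctor_inside_grad may differ slightly, and the scalar input is an assumption so that grad's output is a scalar as it requires:

    import torch
    import functorch

    device = 'cuda'

    def foo(x):
        # Constructing a new tensor inside the grad-transformed function is
        # what hits "Cannot access data pointer of Tensor that doesn't have
        # storage" on CUDA.
        return x * torch.tensor(2., device=device)

    x = torch.randn([], device=device)  # scalar input (assumed)
    functorch.grad(foo)(x)  # RuntimeError on CUDA master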