
Conversation

@davidberard98
Contributor

@davidberard98 davidberard98 commented Jan 26, 2023

Stack from ghstack:

For some tensor x, x.type(torch.FloatTensor) does essentially the same thing as x.to(torch.float). x.type can be called with at least 3 kinds of input:

  • a string: "torch.FloatTensor"
  • a dtype: torch.float
  • a tensor type: torch.FloatTensor

The third option (torch.FloatTensor) fails in fx, because fx cannot trace torch.FloatTensor objects. So this PR replaces the torch.FloatTensor type with the string "torch.FloatTensor".
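A minimal sketch of the conversion idea (the helper name is hypothetical, not dynamo's actual internals): a tensor-type class can be turned into the string form that x.type() also accepts by joining its module and class name. A stand-in class is used below so the snippet runs without torch installed.

```python
def tensor_type_to_string(tensor_type):
    # e.g. torch.FloatTensor -> "torch.FloatTensor"
    return f"{tensor_type.__module__}.{tensor_type.__name__}"

# Stand-in for torch.FloatTensor, so this sketch runs without torch.
class FloatTensor:
    pass
FloatTensor.__module__ = "torch"

print(tensor_type_to_string(FloatTensor))  # -> torch.FloatTensor
```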

Why not fix this in fx? It's possible, but I'm not sure of a nice way to do it. We would want to update torch.fx.node.BaseArgumentTypes to contain torch.FloatTensor etc. We could hard-code a list of tensor types there (the set of available types varies by build, e.g. whether or not cuda tensors are available), but that's not great in case our hard-coded list differs from the actual list registered by python_tensor.cpp. Another option is to dynamically populate the list of types with Union[tuple(...)], filling the tuple from torch._tensor_classes (which is populated directly by python_tensor.cpp), but apparently this breaks most typecheckers.
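The rejected fx-side option can be illustrated with a pure-Python sketch (int and float stand in for torch._tensor_classes here): building a Union from a runtime tuple works fine at runtime, but that is exactly the pattern static typecheckers refuse to analyze, since annotations must be resolvable without executing the code.

```python
from typing import Union

# Stand-in for torch._tensor_classes, which is populated at runtime
# by python_tensor.cpp and varies by build.
tensor_classes = (int, float)

# Works at runtime: Union accepts a tuple subscript.
DynamicTensorType = Union[tuple(tensor_classes)]
print(DynamicTensorType)  # typing.Union[int, float]

# ...but static typecheckers (e.g. mypy) cannot evaluate an annotation
# built from a runtime value, so this pattern breaks type checking.
```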

cc @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @chunyuan-w @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @desertfire

@pytorch-bot

pytorch-bot bot commented Jan 26, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/93043

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 30a132a:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

davidberard98 added a commit that referenced this pull request Jan 26, 2023

ghstack-source-id: eed9550
Pull Request resolved: #93043
@davidberard98 davidberard98 changed the title [WIP][dynamo] support [tensor].type(torch.FloatTensor) [dynamo] support [tensor].type(torch.FloatTensor) Jan 26, 2023
@davidberard98 davidberard98 marked this pull request as ready for review January 26, 2023 17:25
@davidberard98
Contributor Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jan 27, 2023
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

@facebook-github-bot facebook-github-bot deleted the gh/davidberard98/166/head branch June 8, 2023 16:01