[TensorExpr] Implement shape inference for TE. #41451
Conversation
Accepting to unblock. We're going to need to think up a more comprehensive testing strategy before enabling the TE fuser in master, which should hopefully flush out any issues here. I also suspect a lot of this will need to be rethought and revisited if/when we enable TE with unknown dimensions.
@Krovatkin maybe you can take a look as well
torch/csrc/jit/tensorexpr/kernel.cpp (Outdated)

-static std::pair<std::vector<ExprHandle>, bool> broadcastShapes(
+std::vector<ExprHandle> TensorExprKernel::broadcastShapes(
     std::vector<std::vector<ExprHandle>> shapes) {
   bool seenBroadcast = false;
unused
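For readers unfamiliar with the broadcasting rule this helper implements: two shapes are aligned from the innermost dimension outward, and a size-1 (or missing) dimension stretches to match the other. Below is a minimal, self-contained sketch of that rule over concrete integer sizes; the real helper in kernel.cpp operates on symbolic ExprHandle dimensions, and `broadcastTwoShapes` is a hypothetical name used for illustration, not the TE API.

```cpp
#include <algorithm>
#include <cstdint>
#include <stdexcept>
#include <vector>

// Sketch only: NumPy-style broadcasting of two concrete shapes.
// The TE kernel applies the same alignment idea to symbolic dimensions.
std::vector<int64_t> broadcastTwoShapes(
    const std::vector<int64_t>& a,
    const std::vector<int64_t>& b) {
  std::vector<int64_t> result;
  auto itA = a.rbegin();
  auto itB = b.rbegin();
  // Walk both shapes from the innermost dimension outward; a missing
  // dimension behaves like size 1.
  while (itA != a.rend() || itB != b.rend()) {
    const int64_t dimA = (itA != a.rend()) ? *itA++ : 1;
    const int64_t dimB = (itB != b.rend()) ? *itB++ : 1;
    if (dimA != dimB && dimA != 1 && dimB != 1) {
      throw std::runtime_error("shapes are not broadcastable");
    }
    result.push_back(std::max(dimA, dimB));
  }
  // The shape was built innermost-first, so flip it back.
  std::reverse(result.begin(), result.end());
  return result;
}

// Example: broadcastTwoShapes({8, 1, 6}, {7, 6}) == {8, 7, 6}.
```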
@ZolotukhinM merged this pull request in 2deccce.
Stack from ghstack:
Since TE operates on a limited subset of ops with well-defined
semantics, we can easily infer the shapes of intermediate and output
tensors given the shapes of the inputs.

A couple of ops are not yet supported in the shape inference; once we
add them, we can relax the shape-info requirements in the TE fuser:
currently it requires all values in the fusion group to have known
shapes, and we could change that to require shapes only for the inputs.
Differential Revision: D22543470
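As a rough illustration of the approach described above: because every op handled here has simple shape semantics, shapes can be propagated through a fusion group in topological order starting from the input shapes. The sketch below assumes purely elementwise nodes and reuses `broadcastTwoShapes` from the earlier snippet; `Node` and `inferShapes` are illustrative names, not the actual JIT/TE API.

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// A toy "fusion group": each node produces one named value from some inputs.
// We assume every op is elementwise, so its output shape is the broadcast of
// its input shapes (non-elementwise ops would need their own shape rules).
struct Node {
  std::string output;
  std::vector<std::string> inputs;
};

std::unordered_map<std::string, std::vector<int64_t>> inferShapes(
    const std::vector<Node>& group,  // nodes in topological order
    std::unordered_map<std::string, std::vector<int64_t>> shapes) {  // known input shapes
  for (const auto& node : group) {
    std::vector<int64_t> out = shapes.at(node.inputs[0]);
    for (size_t i = 1; i < node.inputs.size(); ++i) {
      out = broadcastTwoShapes(out, shapes.at(node.inputs[i]));
    }
    shapes[node.output] = out;  // intermediate/output shape is now known
  }
  return shapes;
}

// Example: c = a + b; d = relu(c). Only a and b need shapes up front:
//   inferShapes({{"c", {"a", "b"}}, {"d", {"c"}}},
//               {{"a", {8, 1, 6}}, {"b", {7, 6}}})
// yields {8, 7, 6} for both c and d.
```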