
Conversation

@antoniojkim
Collaborator

First stage of breaking up #74710

Moves the shape and operand definitions from TsNode to the base Node

CC: @wconstab @JackCaoG @henrytwo

Partially Fixes #74628

@facebook-github-bot
Contributor

facebook-github-bot commented Apr 4, 2022

💊 CI failures summary and remediations

As of commit 11aabbf (more details on the Dr. CI page):

💚 💚 Looks good so far! There are no failures yet. 💚 💚

This comment was automatically generated by Dr. CI.

hash = HashCombine(hash, static_cast<uint64_t>(kNullOpt));
continue;
}
auto operand_hash = bakeInSizes ? operand.hash_with_sizes() : operand.hash_without_sizes();

@JackCaoG I don't think the `hash_with_sizes`/`hash_without_sizes` here is a big deal, since they are already APIs on `Node::` and that's the bigger issue. OK with you to land this and then clean up those methods separately?


@wonjoolee95
We actually have our own version of `GetOperandHashes`. I think the constructor is a bit messy right now, as we explicitly override the constructor to take `xla::Shape` and currently ignore the `lazy::Shape`.

Node::Node(OpKind op, OpList operands, std::vector<Shape>&& shapes,
           size_t num_outputs, hash_t hash_seed)

will not affect us until we adapt the codegen and start passing `lazy::Shape` around. I am OK with this change now.


Thanks for pointing that out. That is correct, this should be a no-op from XLA's side.

@wonjoo-wj
Collaborator

Thanks! LGTM from PT/XLA's side. I can follow-up with removing some of the code on XLA's side once this PR merges.

cc @JackCaoG

@JackCaoG (Collaborator) left a comment

Thanks. PyTorch/XLA can have a follow-up PR to clean up our Node.

@facebook-github-bot
Contributor

@wconstab has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@bdhirsh added the triaged label (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Apr 5, 2022
facebook-github-bot pushed a commit that referenced this pull request Apr 6, 2022
Summary:
First stage of breaking up #74710

Moves the shape and operand definitions from `TsNode` to the base `Node`

CC: wconstab JackCaoG henrytwo

Partially Fixes #74628

Pull Request resolved: #75223

Reviewed By: zou3519

Differential Revision: D35410285

Pulled By: wconstab

fbshipit-source-id: bb84d3fb636882cbe7e18af4b35ff2c0e22aaa58
@github-actions
Contributor

github-actions bot commented Apr 6, 2022

Hey @antoniojkim.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.


Labels

- cla signed
- open source
- release notes: lazy (release notes category)
- topic: not user facing (topic category)
- triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)


Development

Successfully merging this pull request may close these issues.

Decouple LTC from TS backend

7 participants