
Conversation

@wconstab wconstab commented Oct 1, 2021

Depends on pytorch PR pytorch/pytorch#66181

@JackCaoG JackCaoG self-requested a review October 1, 2021 03:56

@JackCaoG JackCaoG left a comment


The TF update seems irrelevant. Was it included by mistake?

@wconstab wconstab commented Oct 1, 2021

Yep, a mistake.

Btw, this isn't ready yet. I was just seeing if I could push. I just got the right bazel version and am trying to build locally, but I am sure some things are missing/wrong.

@wconstab wconstab changed the title Wconstab/ltc core hash [wip] Wconstab/ltc core hash Oct 1, 2021
@JackCaoG JackCaoG commented Oct 1, 2021

Sounds good, let me know when it is ready 😄

@wconstab wconstab changed the title [wip] Wconstab/ltc core hash wconstab/ltc core hash Oct 5, 2021
@wconstab wconstab force-pushed the wconstab/ltc_core_hash branch from dda0426 to ad9e44c Compare October 5, 2021 01:09
As part of the migration of core lazy tensor functionality to PyTorch core,
this uses the newly added torch::lazy::Hash functions from PyTorch.

Note: while the Hash* functions are largely identical to the original
XLA ones, the underlying uint128 class is from protobuf instead of absl,
since it was a slightly smaller dependency to ingest and get building on
multiple OS/platform combinations for PyTorch.
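
(For readers unfamiliar with the new API, the sketch below illustrates the hashing helpers this commit refers to. torch::lazy::Hash and torch::lazy::MHash come from the PyTorch PR above; the enclosing function and variable names are made up for illustration.)

```cpp
// Illustrative sketch, not code from this PR: torch::lazy::Hash computes a
// hash for a single value (including containers such as std::vector<int64_t>),
// and torch::lazy::MHash folds several values/hashes into one combined hash.
#include <torch/csrc/lazy/core/hash.h>

#include <cstdint>
#include <vector>

torch::lazy::hash_t ExampleNodeHash(const std::vector<int64_t>& base_indices,
                                    const std::vector<int64_t>& sizes) {
  // Hash each component, then combine, mirroring the pattern used at the
  // lowering call sites in this PR (which previously used xla::util::MHash).
  return torch::lazy::MHash(torch::lazy::Hash(base_indices),
                            torch::lazy::Hash(sizes));
}
```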
@wconstab wconstab force-pushed the wconstab/ltc_core_hash branch from ad9e44c to 0c8bd01 Compare October 5, 2021 04:13
@wconstab wconstab commented Oct 5, 2021

@JackCaoG OK, this is ready for review now. I am in the process of landing the pytorch/pytorch PR.


@JackCaoG JackCaoG left a comment


Mostly LGTM, minor nits. Should we also clean up xla/third_party/xla_client/util.h?

  [&]() { return NodeOutputShape(input, base_indices, sizes); },
- /*num_outputs=*/1, xla::util::MHash(base_indices, sizes)),
+ /*num_outputs=*/1,
+ torch::lazy::MHash(torch::lazy::Hash(base_indices),
Collaborator


Why couldn't it just be

torch::lazy::MHash(base_indices, sizes);

here?

Collaborator Author


Yeah, I'll take another look here. I think I had trouble getting torch/core's MHash to work natively with torch_xla/util's Hash(absl::Span) impl.

I might have to define or declare the absl one in a different way so that the torch one can find it?

Collaborator Author


I found a solution here, which is to define a new MHash template in torch_util.h that specializes on absl::Span.
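
(For context, here is a rough sketch of the kind of bridging overload described above. This is an assumption about the shape of the fix, not the actual code in torch_util.h: the comment mentions an MHash template, while the sketch shows the same idea via a Hash overload for absl::Span that the variadic combiners can build on. Names like HashCombine and the seeding scheme are illustrative.)

```cpp
// Sketch only: give torch_xla a Hash overload for absl::Span so that spans can
// feed into the torch::lazy hash combiners (e.g. torch::lazy::MHash).
#include <torch/csrc/lazy/core/hash.h>

#include <cstdint>

#include "absl/types/span.h"

namespace torch_xla {

template <typename T>
torch::lazy::hash_t Hash(absl::Span<const T> values) {
  // Seed with the span length, then fold in each element's hash.
  // (Illustrative scheme; the real implementation may differ.)
  torch::lazy::hash_t h =
      torch::lazy::Hash(static_cast<uint64_t>(values.size()));
  for (const T& value : values) {
    h = torch::lazy::HashCombine(h, torch::lazy::Hash(value));
  }
  return h;
}

}  // namespace torch_xla

// Hypothetical usage at a lowering call site:
//   torch::lazy::MHash(torch_xla::Hash(base_indices), torch_xla::Hash(sizes));
```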

@JackCaoG JackCaoG commented Oct 6, 2021

@wconstab Could you also include your pytorch PR link in the PR description? That will make it easier for us to track this PR in the future.

@wconstab wconstab commented Oct 6, 2021

Sure. It already got reverted, though, because of some extra Windows builds that didn't run on the PR. I'll tag the new pytorch PR here once it's ready.

@wconstab wconstab force-pushed the wconstab/ltc_core_hash branch from 622c095 to b67be8e Compare October 6, 2021 20:55
@wconstab wconstab requested a review from JackCaoG October 7, 2021 00:28

@JackCaoG JackCaoG left a comment


Thanks!

@wconstab wconstab commented Oct 7, 2021

I'm landing the pytorch changes again now. I'm not going to rush to merge the xla side in case the pytorch side gets reverted; I'll wait a few days and then, if it's stable, press the button.

Btw, what do I do about the .torch_pin file? I guess I can delete it once the pytorch/pytorch PR is merged, and then CI will still pass by running against master.

@JackCaoG JackCaoG commented Oct 7, 2021

> I'm landing the pytorch changes again now. I'm not going to rush to merge the xla side in case the pytorch side gets reverted; I'll wait a few days and then, if it's stable, press the button.
>
> Btw, what do I do about the .torch_pin file? I guess I can delete it once the pytorch/pytorch PR is merged, and then CI will still pass by running against master.

Yeah, you can delete the .torch_pin once the pytorch-side PR is merged and rerun the CI. I agree that we don't need to rush to merge this PR.

@wconstab wconstab merged commit 52be55d into master Oct 12, 2021
@wconstab wconstab deleted the wconstab/ltc_core_hash branch October 12, 2021 03:29
seemethere added a commit to pytorch/pytorch that referenced this pull request Nov 23, 2021
…_data"


Looks like these files are getting used by downstream xla (as of pytorch/xla#3148), so we need to
include them in our package_data so that they get included when we build `bdist_wheel`, like in our GHA workflows.

Signed-off-by: Eli Uriegas <eliuriegas@fb.com>

Differential Revision: [D32622241](https://our.internmc.facebook.com/intern/diff/D32622241)

[ghstack-poisoned]
seemethere added a commit to pytorch/pytorch that referenced this pull request Nov 23, 2021
Looks like these files are getting used by downstream xla (as of pytorch/xla#3148), so we need to
include them in our package_data so that they get included when we build `bdist_wheel`, like in our GHA workflows.

Signed-off-by: Eli Uriegas <eliuriegas@fb.com>

Differential Revision: [D32622241](https://our.internmc.facebook.com/intern/diff/D32622241)

[ghstack-poisoned]
seemethere added a commit to pytorch/pytorch that referenced this pull request Nov 29, 2021
…_data"


Looks like these files are getting used by downstream xla (as of pytorch/xla#3148), so we need to
include them in our package_data so that they get included when we build `bdist_wheel`, like in our GHA workflows.

Signed-off-by: Eli Uriegas <eliuriegas@fb.com>

Differential Revision: [D32622241](https://our.internmc.facebook.com/intern/diff/D32622241)

[ghstack-poisoned]
seemethere added a commit to pytorch/pytorch that referenced this pull request Nov 29, 2021
Looks like these files are getting used by downstream xla (as of pytorch/xla#3148), so we need to
include them in our package_data so that they get included when we build `bdist_wheel`, like in our GHA workflows.

Signed-off-by: Eli Uriegas <eliuriegas@fb.com>

Differential Revision: [D32622241](https://our.internmc.facebook.com/intern/diff/D32622241)

[ghstack-poisoned]