
Conversation

migeed-z
Contributor

@migeed-z commented Jul 5, 2022

@facebook-github-bot
Contributor

facebook-github-bot commented Jul 5, 2022


✅ No Failures (0 Pending)

As of commit 14409e5 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.


migeed-z added a commit that referenced this pull request Jul 5, 2022
ghstack-source-id: 1d8a8c1
Pull Request resolved: #80909
- The constraints for `ne` are the same as the ones for tensor addition.
- The constraints for `layernorm` ensure that the input has the form `(*, d1, ..., dn)`, where `d1, ..., dn` are consistent with the `normalized_shape` of the form `d1', ..., dn'`. Since we are using gradual types, they do not have to be equal. The final result is then equal to the input and has the form `(*, d1, ..., dn)` (see the sketch after this commit message).

[ghstack-poisoned]
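As a rough illustration of the consistency rule described above, here is a minimal, self-contained sketch. It is not the actual `migrate_gradual_types` constraint API; the `Dyn` marker, `consistent`, and `layernorm_output_shape` below are hypothetical stand-ins that only capture the idea that a gradual (unknown) dimension is consistent with any size, and that the output type equals the input type.

```python
# Hypothetical, simplified sketch of the layernorm shape rule described above.
# "Dyn" marks a gradual (unknown) dimension; it is consistent with any size.

from typing import List, Union

Dyn = "Dyn"                 # gradual/unknown dimension
Dim = Union[int, str]       # a dimension is either a concrete int or Dyn


def consistent(d1: Dim, d2: Dim) -> bool:
    """Two dimensions are consistent if either is Dyn or they are equal."""
    return d1 == Dyn or d2 == Dyn or d1 == d2


def layernorm_output_shape(input_shape: List[Dim],
                           normalized_shape: List[int]) -> List[Dim]:
    """Require the input to have the form (*, d1, ..., dn) with d1, ..., dn
    consistent with normalized_shape; the output type equals the input type."""
    n = len(normalized_shape)
    if len(input_shape) < n:
        raise TypeError("input rank is smaller than the rank of normalized_shape")
    for d, nd in zip(input_shape[-n:], normalized_shape):
        if not consistent(d, nd):
            raise TypeError(f"dimension {d!r} is not consistent with {nd!r}")
    return list(input_shape)    # the result is equal to the input


# Dyn can stand in for any concrete size, so this check passes.
print(layernorm_output_shape([Dyn, 3, Dyn], [3, 4]))   # [Dyn, 3, Dyn]
```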
return [Disj([c1, Disj(c2)])], counter

# return [BinConstraintT(input, output, op_eq),
#         BinConstraintT(input, normalized_shape, op_consistency)], counter
Collaborator

delete?
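For context on the line being discussed, here is a simplified, hypothetical sketch of how a disjunctive constraint such as `Disj([c1, Disj(c2)])` could be represented and evaluated. The `Disj`/`Conj` classes and the `holds` function below are illustrative stand-ins, not the real constraint classes from `migrate_gradual_types`.

```python
# Simplified, hypothetical stand-ins for the constraint containers above.
# A Disj holds if at least one constituent holds; a Conj needs all of them.

from dataclasses import dataclass
from typing import List, Union

Constraint = Union["Disj", "Conj", bool]


@dataclass
class Disj:
    constraints: List["Constraint"]


@dataclass
class Conj:
    constraints: List["Constraint"]


def holds(c: Constraint) -> bool:
    """Evaluate a constraint tree; booleans act as already-decided leaves."""
    if isinstance(c, bool):
        return c
    if isinstance(c, Disj):
        return any(holds(x) for x in c.constraints)
    if isinstance(c, Conj):
        return all(holds(x) for x in c.constraints)
    raise TypeError(f"unknown constraint: {c!r}")


# A nested disjunction like Disj([c1, Disj(c2)]) reads as "c1, or any of c2".
c1, c2 = False, [False, True]
print(holds(Disj([c1, Disj(c2)])))   # True: one constraint in c2 holds
```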

migeed-z added 5 commits July 6, 2022 18:56
- The constraints for `ne` are the same as the ones for tensor addition.
- The constraints for `layernorm` ensure that the input has the form `(*, d1, ..., dn)`, where `d1, ..., dn` are consistent with the `normalized_shape` of the form `d1', ..., dn'`. Since we are using gradual types, they do not have to be equal. The final result is then equal to the input and has the form `(*, d1, ..., dn)`.

[ghstack-poisoned]
migeed-z added 2 commits July 11, 2022 15:09
- The constraints for `ne` are the same as the ones for tensor addition.
- The constraints for `layernorm` ensure that the input has the form `(*, d1, ..., dn)`, where `d1, ..., dn` are consistent with the `normalized_shape` of the form `d1', ..., dn'`. Since we are using gradual types, they do not have to be equal. The final result is then equal to the input and has the form `(*, d1, ..., dn)`.

[ghstack-poisoned]
@migeed-z
Contributor Author

@pytorchbot merge -g

@pytorchmergebot
Collaborator

@pytorchbot successfully started a merge job. Check the current status here

@pytorchmergebot
Collaborator

Merge failed due to Command `git -C /home/runner/actions-runner/_work/pytorch/pytorch cherry-pick -x 45d805906f54bf2a3b18d2c9f9f3937437f64c8c` returned non-zero exit code 1

Auto-merging test/fx/test_z3_gradual_types.py
CONFLICT (content): Merge conflict in test/fx/test_z3_gradual_types.py
Auto-merging torch/fx/experimental/migrate_gradual_types/constraint_generator.py
CONFLICT (content): Merge conflict in torch/fx/experimental/migrate_gradual_types/constraint_generator.py
error: could not apply 45d805906f... constraints for arange, size, full
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git cherry-pick --continue".
hint: You can instead skip this commit with "git cherry-pick --skip".
hint: To abort and get back to the state before "git cherry-pick",
hint: run "git cherry-pick --abort".

Raised by https://github.com/pytorch/pytorch/actions/runs/2659862433

- The constraints for `ne` are the same as the ones for tensor addition.
- The constraints for `layernorm` ensure that the input has the form `(*, d1, ..., dn)`, where `d1, ..., dn` are consistent with the `normalized_shape` of the form `d1', ..., dn'`. Since we are using gradual types, they do not have to be equal. The final result is then equal to the input and has the form `(*, d1, ..., dn)`.

[ghstack-poisoned]
@migeed-z
Contributor Author

@pytorchbot merge -g

@pytorchmergebot
Collaborator

@pytorchbot successfully started a merge job. Check the current status here

@github-actions
Contributor

Hey @migeed-z.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

facebook-github-bot pushed a commit that referenced this pull request Jul 14, 2022
Summary:
- The constraints for `ne` are the same as the ones for tensor addition.
- The constraints for `layernorm` ensure that the input has the form `(*, d1, ..., dn)`, where `d1, ..., dn` are consistent with the `normalized_shape` of the form `d1', ..., dn'`. Since we are using gradual types, they do not have to be equal. The final result is then equal to the input and has the form `(*, d1, ..., dn)`.

Pull Request resolved: #80909
Approved by: https://github.com/jamesr66a

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/27db2750ba3fd524e0a03013bbc1aa6f44165224

Reviewed By: DanilBaibak

Differential Revision: D37847284

Pulled By: migeed-z

fbshipit-source-id: b1d0bf933e272bd879140517ce3dddbeab0dd137
@facebook-github-bot deleted the gh/migeed-z/13/head branch July 17, 2022 14:16