@yoyoyocmu yoyoyocmu commented Jan 17, 2024

Summary:
This change adds `hasattr` support for `TupleVariable` in dynamo.

This fix is part of: #117670

Test Plan: Unit test and CI

Differential Revision: D52850665

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @aakhundov @kadeng
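For context, the behavior this PR enables can be sketched in plain Python. The following is a minimal illustrative mock, not the actual PyTorch source: the class and method names mirror dynamo's variable-tracker convention (`TupleVariable`, `ConstantVariable`, `call_hasattr`) but the implementation here is an assumption. The key idea is that because tuples are immutable and have no instance `__dict__`, a `hasattr` check during tracing can be resolved statically from the tuple type into a compile-time constant instead of raising `Unsupported`:

```python
# Hypothetical sketch of resolving hasattr() on a traced tuple.
# Not the actual dynamo source; names mirror its conventions.

class ConstantVariable:
    """Minimal stand-in for dynamo's ConstantVariable."""
    def __init__(self, value):
        self.value = value

class TupleVariable:
    """Minimal stand-in tracking a Python tuple during tracing."""
    python_type = tuple

    def call_hasattr(self, name):
        # Tuple instances cannot gain attributes, so checking the
        # type is sufficient and yields a compile-time constant.
        return ConstantVariable(hasattr(self.python_type, name))

tv = TupleVariable()
print(tv.call_hasattr("index").value)    # True: tuple has .index()
print(tv.call_hasattr("missing").value)  # False: no such attribute
```

In this sketch, code like `hasattr(t, "index")` inside a compiled function would fold to a constant `True` rather than triggering a graph break.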


pytorch-bot bot commented Jan 17, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/117694

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (6 Unrelated Failures)

As of commit e555fb5 with merge base 84cfe6d:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

BROKEN TRUNK - The following jobs failed but were already failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D52850665

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D52850665

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Jan 18, 2024
@facebook-github-bot
Contributor

@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)

@pytorchmergebot
Collaborator

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f only as a last resort; instead, consider -i/--ignore-current to continue the merge while ignoring current failures. This allows currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@kkkkk-ux

Is this fix finished? I still get a `torch._dynamo.exc.Unsupported: hasattr: TupleVariable()` error when using dynamo export to export the opt-125m model under the GPTQ quantization algorithm.
