jit trace will fail for parameter check if it contains param whose kind is _ParameterKind.VAR_KEYWORD #94032
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/94032
Note: links to docs will display an error until the doc builds have completed. ✅ No failures as of commit ef4d483. This comment was automatically generated by Dr. CI and updates every 15 minutes.
This fixes the JIT trace failure for the BLOOM model, whose forward signature contains `**deprecated_arguments` (see https://github.com/huggingface/transformers/blob/main/src/transformers/models/bloom/modeling_bloom.py#L880). The `[]` returned for this parameter causes a failure in the check at https://github.com/pytorch/pytorch/blob/master/torch/jit/_trace.py#L1036.
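The failing check compares the traced inputs against the function's parameter list. A minimal standard-library sketch of the idea behind the fix (the function name and parameters below are illustrative, not the PR's actual code): a `**kwargs`-style parameter has kind `VAR_KEYWORD` and should be excluded when counting expected positional inputs.

```python
import inspect

# Illustrative signature mirroring the BLOOM forward: a trailing **kwargs.
def forward(input_ids, attention_mask=None, **deprecated_arguments):
    return input_ids

sig = inspect.signature(forward)

# VAR_KEYWORD parameters (**kwargs) never receive positional inputs,
# so they should not be counted when validating the number of traced inputs.
countable = [
    name for name, p in sig.parameters.items()
    if p.kind is not inspect.Parameter.VAR_KEYWORD
]
print(countable)  # the **deprecated_arguments entry is filtered out
```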
@yao-matrix @jgong5, please help review.
@sywangyi, do you think you could add a test for the behavior that previously would fail? You can add it in test/jit/test_tracer.py.
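A test along those lines might look like the following sketch (hypothetical module name and tensor shapes; this is not the test actually added to test/jit/test_tracer.py):

```python
import torch
import torch.nn as nn

class ModuleWithVarKeyword(nn.Module):
    # forward mirrors the BLOOM pattern: a trailing **kwargs parameter
    # that receives no traced inputs.
    def forward(self, x, **deprecated_arguments):
        return x * 2

m = ModuleWithVarKeyword()
example = torch.ones(2, 3)
# Before the fix, this trace call failed the parameter-count check.
traced = torch.jit.trace(m, (example,))
assert torch.equal(traced(example), m(example))
```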
Hi @davidberard98, thanks for the review. I have added it.
@davidberard98 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@sywangyi, can you take a closer look at the test failures? I think this change looks okay, but please fix the test failures first. Also, I recommend you rebase/merge off of the […]
Adding my thoughts on why this change is reasonable, in case I forget later. First, why I initially thought this might be concerning: we silently allow users to proceed in cases where the function takes […] Counterpoint (why this change is okay): this PR doesn't change the fact that […]
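To illustrate the counterpoint with a standard-library sketch (illustrative names, not the PR's code): a `**kwargs` parameter consumes no positional inputs, so ignoring it when counting expected positional arguments does not change which calls succeed or fail.

```python
import inspect

def forward(x, y=None, **deprecated_arguments):
    return x

sig = inspect.signature(forward)

# Binding only positional inputs succeeds, and the **kwargs slot stays empty:
bound = sig.bind(1)
assert "deprecated_arguments" not in bound.arguments

# Too many positional inputs still fail, with or without **kwargs present:
try:
    sig.bind(1, 2, 3)
except TypeError:
    too_many = True
```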
Actually, I retract my approval based on the test failures... can you take a look at those?
Hi @davidberard98, I checked the failure case; one is test_coalesce_reference_cycle_cpu_float64 in test_sparse.py. I ran all the test_sparse.py cases in my environment and they all pass. That test does not even enter the logic I changed; I placed a breakpoint there to confirm.
@pytorchbot rebase -s
Let's see if a rebase fixes the issues. |
@pytorchbot successfully started a rebase job. Check the current status here.
Successfully rebased c4b47de to a3ab1b0.
@pytorchbot rebase -s
@pytorchbot successfully started a rebase job. Check the current status here.
jit trace will fail for parameter check if it contains param whose kind is _ParameterKind.VAR_KEYWORD
Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>
Successfully rebased a3ab1b0 to ef4d483.
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
jit trace will fail for parameter check if it contains param whose kind is _ParameterKind.VAR_KEYWORD