[ONNX] Do not run 'deduplicate_initializers' when 'keep_initializers_as_inputs' is True #96320
Conversation
…as_inputs' is True
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96320
Note: Links to docs will display an error until the docs builds have been completed.
✅ 1 Unrelated Failure
As of commit 6f59ab3: UNSTABLE - The following job failed, but was likely due to flakiness present on trunk and has been marked as unstable.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
…as_inputs' is True
ghstack-source-id: 3971e381c11f3291ce1a14f985fc245a6360c391
Pull Request resolved: #96320
Maybe update the docstring for `keep_initializers_as_inputs` to let users know of this side effect?
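One way to word that note (a sketch of possible docstring text, not the wording that was actually merged):

```
keep_initializers_as_inputs (bool, optional):
    If True, all initializers (typically model parameters) are also
    added as inputs to the exported graph, so a runtime can override
    them. In that case, de-duplication of initializers is skipped,
    so that each parameter remains individually overridable.
```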
Looks like this PR hasn't been updated in a while, so we're going to go ahead and mark this as Stale.
@BowenBao can you update the docstring per @thiagocrepaldi's ask and rebase the PR?
@pytorchbot rebase
@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here.
…itializers_as_inputs' is True"

### Proposal
When the `keep_initializers_as_inputs` argument is True, it is quite possible that parameter values are overridden via the corresponding graph inputs. Hence the de-duplicate-initializers optimization should be disabled when `keep_initializers_as_inputs=True`.
- [ ] Update doc related to `keep_initializers_as_inputs`.
Successfully rebased
…as_inputs' is True
ghstack-source-id: fead9d663757be87f05487c2816f95bf5c78a4eb
Pull Request resolved: #96320
…as_inputs' is True
ghstack-source-id: c0d7d0fb8aee0bc363be8cef28e4698f51ff1623
Pull Request resolved: #96320
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Stack from ghstack (oldest at bottom):
### Proposal
When the `keep_initializers_as_inputs` argument is True, it is quite possible that parameter values are overridden via the corresponding graph inputs.
Hence the de-duplicate-initializers optimization should be disabled when `keep_initializers_as_inputs=True`.
- [ ] Update doc related to `keep_initializers_as_inputs`.