
Commit

Update on "[ONNX] Do not run 'deduplicate_initializers' when 'keep_initializers_as_inputs' is True"


### Proposal
When the `keep_initializers_as_inputs` argument is True, it is quite possible that parameters are supplied via input initializers at runtime.
Hence the de-duplicate initializer optimization should be disabled when `keep_initializers_as_inputs=True`, so that initializers with identical values remain distinct graph inputs.
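To see why the pass conflicts with runtime-overridable inputs, here is a minimal pure-Python sketch of what an initializer-deduplication pass does. It is illustrative only; the actual `deduplicate_initializers` pass in `torch/onnx` operates on ONNX graph protos, and the names here are hypothetical.

```python
def deduplicate_initializers(initializers):
    """Collapse initializers with identical values into a single entry.

    `initializers` maps name -> a hashable stand-in for a tensor value.
    Returns (deduped_initializers, name_remapping), where the remapping
    records which duplicate names were folded into which canonical name.
    """
    seen = {}     # value -> canonical initializer name
    remap = {}    # duplicate name -> canonical name
    deduped = {}
    for name, value in initializers.items():
        if value in seen:
            # Node inputs referencing `name` would be rewritten to the
            # canonical name, and `name` dropped from the graph.
            remap[name] = seen[value]
        else:
            seen[value] = name
            deduped[name] = value
    return deduped, remap

# Two parameters that happen to share a value at export time:
inits = {"w1": (1.0, 2.0), "w2": (1.0, 2.0), "b": (0.0,)}
deduped, remap = deduplicate_initializers(inits)
# "w2" collapses into "w1" -- fine for baked-in constants, but wrong if
# "w1" and "w2" must remain independently overridable graph inputs.
```

After deduplication, only one of the two identical parameters survives as a graph input, so a caller could no longer feed `w1` and `w2` different values at runtime; this is exactly the situation the commit avoids.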

- [ ] Update doc related to `keep_initializers_as_inputs`.

[ghstack-poisoned]
BowenBao committed Aug 1, 2023
2 parents e4bb10d + ae1a93f commit 6f59ab3
Showing 1 changed file with 5 additions and 0 deletions.
5 changes: 5 additions & 0 deletions torch/onnx/utils.py
@@ -454,6 +454,11 @@ def forward(self, x):
This may allow for better optimizations (e.g. constant folding) by
backends/runtimes.
If True, the `deduplicate_initializers` pass will not be executed. This means
initializers with duplicated values will not be deduplicated and
will be treated as distinct inputs to the graph. This allows different
input initializers to be supplied at runtime after export.
If ``opset_version < 9``, initializers MUST be part of graph
inputs and this argument will be ignored and the behavior will be
equivalent to setting this argument to True.
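The gating behavior described in the docstring can be summarized in a small hypothetical helper (the function name and structure are illustrative, not the actual `torch.onnx` internals):

```python
def should_deduplicate_initializers(keep_initializers_as_inputs: bool,
                                    opset_version: int) -> bool:
    """Return True if the deduplicate_initializers pass should run,
    per the docstring: the pass is skipped when initializers are kept
    as graph inputs, and opset < 9 forces that behavior regardless of
    the argument's value."""
    if opset_version < 9:
        # Initializers must be graph inputs; the argument is ignored
        # and treated as True.
        keep_initializers_as_inputs = True
    return not keep_initializers_as_inputs
```

For example, with `opset_version=12` the pass runs only when `keep_initializers_as_inputs` is False, while with `opset_version=8` it never runs.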
