
[Nuphar] improvements in symbolic_shape_infer and model editor #1787

Merged
merged 7 commits into master on Sep 12, 2019

Conversation

ke1337
Contributor

@ke1337 ke1337 commented Sep 9, 2019

Description: This change improves symbolic shape inference to support more models.
It also fixes a potential unaligned-buffer crash in onnxruntime_perf_test.

Motivation and Context

  • Shape information is required for compilers like Nuphar to execute models efficiently

KeDengMS added 3 commits September 7, 2019 02:48
symbolic shape inference fixes to support more sophisticated models
remove useless code from model_quantizer
@ke1337 ke1337 requested a review from a team as a code owner September 9, 2019 15:26
continue

if in_n.op_type == 'TopK' and len(in_n.input) == 1:
upgrade_topk_op(nf, in_n)
Contributor

@yangchen-MS yangchen-MS Sep 9, 2019


Should this be calling upgrade_op instead of upgrade_topk_op? #Closed

Contributor Author

@ke1337 ke1337 Sep 9, 2019


Yes, it should be removed.


In reply to: 322348495

'Cast' : self._infer_Cast,
'CategoryMapper' : self._infer_CategoryMapper,
'Compress' : self._infer_Compress,
'Concat' : self._infer_Concat,
'Conv' : self._infer_Conv,
Contributor

@yangchen-MS yangchen-MS Sep 9, 2019


Not sure if we try to maintain alphabetical order or not. If we do, Conv might need to be moved below ConstantOfShape.
#Closed

Contributor Author


Sure


In reply to: 322352071
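The snippet under review is one slice of a per-op dispatch table mapping op types to shape-inference handlers. A minimal sketch of that pattern, using hypothetical handler names (the real handlers compute output shapes), which also enforces the alphabetical ordering discussed above:

```python
class ShapeInferer:
    """Sketch of an op-type dispatch table; handler names are hypothetical."""

    def __init__(self):
        # Keys kept in alphabetical order, per the review comment
        # ('Conv' placed after 'ConstantOfShape').
        self.dispatcher_ = {
            'Cast': self._infer_pass_through,
            'Compress': self._infer_pass_through,
            'Concat': self._infer_pass_through,
            'ConstantOfShape': self._infer_pass_through,
            'Conv': self._infer_pass_through,
        }
        # Cheap guard that the table stays sorted as ops are added.
        assert list(self.dispatcher_) == sorted(self.dispatcher_)

    def _infer_pass_through(self, node):
        # Placeholder: a real handler would derive output shapes from inputs.
        return node

    def infer(self, op_type, node):
        handler = self.dispatcher_.get(op_type)
        return handler(node) if handler else None
```

Keeping the table sorted makes gaps like the missing ConstantOfShape entry easy to spot in review diffs.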

yangchen-MS
yangchen-MS previously approved these changes Sep 10, 2019
Contributor

@yangchen-MS yangchen-MS left a comment


LGTM

if self.verbose_ > 2:
print('Inferencing subgraph of node {} with output({}...): {}'.format(node.name, node.output[0], node.op_type))
assert len(node.input) == len(subgraph.input)
subgraph_implicit_input = set()
Contributor

@skottmckay skottmckay Sep 11, 2019


Maybe a comment calling out that the node inputs are not passed directly to the subgraph would avoid potential confusion about the slightly indirect relationship between these lengths. #Closed
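The suggested clarifying comment could look like the following sketch, shown here with mock node objects rather than the PR's actual ONNX protos:

```python
class MockNode:
    """Stand-in for an ONNX NodeProto/GraphProto with an `input` list."""
    def __init__(self, inputs):
        self.input = inputs

# The node's inputs are NOT passed directly to the subgraph: for Scan/Loop
# bodies each node input corresponds positionally to a subgraph input
# (state variables first, then scan inputs), so only the *counts* are
# expected to match, not the names or values.
node = MockNode(['state_0', 'scan_in_0'])
subgraph = MockNode(['iter_state_0', 'iter_scan_0'])
assert len(node.input) == len(subgraph.input)
```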

subgraph_implicit_input = set()
for sn in subgraph.node:
subgraph_implicit_input.update([i for i in sn.input if i in self.known_vi_])
tmp_graph = helper.make_graph(list(subgraph.node),
Contributor

@skottmckay skottmckay Sep 11, 2019


A value in the subgraph can shadow an outer scope value. Not sure that this takes that into account as I think it's checking outer scope values first for matches. #ByDesign

Contributor


Does this also need to take into account outputs of nodes in the subgraph? Those will also be considered to shadow an outer scope value.


In reply to: 323093408

Contributor Author


known_vi_ would override when an output name in the subgraph shadows an implicit input. This behavior is consistent with the CPU provider when a subgraph output shadows an implicit input, i.e.:
float aaa; // implicit input to subgraph
float bbb;
subgraph {
ccc = aaa + bbb; // using implicit input
bbb = Relu(ccc); // bbb in subgraph shadowed implicit input
}


In reply to: 323675066
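The override behavior described in the reply above can be illustrated with plain dicts (a sketch only; in the PR, known_vi_ maps value names to ValueInfoProto objects):

```python
# Outer-scope values visible to the subgraph as implicit inputs.
outer_scope_vi = {'aaa': 'outer_aaa', 'bbb': 'outer_bbb'}

# Inferring the subgraph produces an output named 'bbb', which shadows
# the implicit input of the same name (like `bbb = Relu(ccc)` above).
subgraph_output_vi = {'ccc': 'sub_ccc', 'bbb': 'sub_bbb'}

# Later entries win the merge, so the subgraph's 'bbb' overrides the
# outer-scope one while 'aaa' remains visible unchanged.
known_vi = {**outer_scope_vi, **subgraph_output_vi}
```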

@@ -240,7 +274,34 @@ def _onnx_infer_single_node(self, node):
vi.CopyFrom(self.tmp_mp_.graph.output[i_o])
self.known_vi_[vi.name] = vi

def _get_int_values(self, node):
def _onnx_infer_subgraph(self, node, subgraph):
if self.verbose_ > 2:
Contributor

@skottmckay skottmckay Sep 11, 2019


I don't think this will handle an If subgraph (input length to If is 1, but the subgraph has 0 explicit inputs), so the name may be misleading if someone tried to add If processing in the future. It may work if you remove the assert that the node's input length matches the subgraph's. #Closed
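If someone later adds If support, the length check could be relaxed along these lines (a hypothetical sketch, not part of this PR):

```python
def subgraph_inputs_consistent(op_type, num_node_inputs, num_subgraph_inputs):
    # An If node has exactly one input (the condition) while its then/else
    # branch subgraphs take no explicit inputs, so the positional length
    # check only applies to Scan/Loop-style bodies.
    if op_type == 'If':
        return num_subgraph_inputs == 0
    return num_node_inputs == num_subgraph_inputs
```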

subgraph = get_attribute(node, 'body')
num_scan_inputs = get_attribute(node, 'num_scan_inputs')
scan_input_axes = get_attribute(node, 'scan_input_axes', [0]*num_scan_inputs)
num_scan_states = len(node.input) - num_scan_inputs
Contributor

@skottmckay skottmckay Sep 11, 2019


Negative axis values are supported here and in the output axes. Also there are a lot more ops that explicitly support negative axis following this PR: onnx/onnx#2281 #Closed
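Supporting negative axes typically means normalizing them against the input's rank before use. A minimal sketch of such a helper (the name and exact behavior here are illustrative, not a claim about the PR's final code):

```python
def handle_negative_axis(axis, rank):
    """Map a possibly-negative ONNX axis into the range [0, rank)."""
    assert -rank <= axis < rank, 'axis out of range for rank {}'.format(rank)
    return axis + rank if axis < 0 else axis

# e.g. for a rank-4 scan input, axis -1 refers to the last dimension:
scan_input_axes = [handle_negative_axis(a, 4) for a in [0, -1]]
```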

Contributor

@yangchen-MS yangchen-MS left a comment


LGTM

Contributor

@skottmckay skottmckay left a comment


:shipit:

@ke1337 ke1337 merged commit 18f7377 into master Sep 12, 2019