
Wrong shape inference when converting TF model #335

Closed
ahuizxc opened this issue Nov 28, 2019 · 12 comments
Assignees
Labels
question Further information is requested

Comments

@ahuizxc

ahuizxc commented Nov 28, 2019

hi, there is a bug when I try to convert a TensorFlow model to an IR model.
The TensorFlow inference graph code is:

descriptor = tf.reduce_sum(residuals, axis=[1, 2])

and the descriptor's shape is

(Pdb) descriptor
<tf.Tensor 'pred/global_head/vlad/Sum:0' shape=(1, 32, 240) dtype=float32>
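For context, `tf.reduce_sum(residuals, axis=[1, 2])` collapses axes 1 and 2, so for the output to have shape (1, 32, 240) the `residuals` tensor must be 5-D. A minimal NumPy sketch of that shape arithmetic (the 8×8 sizes for the reduced axes are made up for illustration):

```python
import numpy as np

# hypothetical residuals tensor; axes 1 and 2 are the ones reduced away
residuals = np.ones((1, 8, 8, 32, 240), dtype=np.float32)
descriptor = residuals.sum(axis=(1, 2))
print(descriptor.shape)  # (1, 32, 240), matching the TF tensor above
```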

so when I try to convert the model file to an IR model, I get an error:

[ ERROR ]  After partial shape inference were found shape collision for node pred/global_head/vlad/Sum (old shape: [  1  32 240], new shape: [  1 240  32])
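Note that the two conflicting shapes contain the same dimensions in a different order, which hints at a layout permutation (e.g. NHWC vs. NCHW reordering) rather than a genuinely different result. The check that raises this error is a plain elementwise comparison, so any permutation trips it; a sketch of that comparison:

```python
import numpy as np

old_shape = np.array([1, 32, 240])
new_shape = np.array([1, 240, 32])

# eliminate.py raises when the re-inferred shape differs elementwise
assert not np.array_equal(old_shape, new_shape)
# but the dims are merely permuted, hinting at a layout (NHWC/NCHW) mix-up
assert sorted(old_shape.tolist()) == sorted(new_shape.tolist())
```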

so I modified the code in

https://github.com/opencv/dldt/blob/fe3f978b98c86eaeed3cbdc280e1ffd0bc50d278/model-optimizer/mo/middle/passes/eliminate.py#L155

def shape_inference(graph: Graph):
    for node in graph.pseudo_topological_sort():
        if node.has_and_set('need_shape_inference'):
            old_out_shapes = [port.data.get_shape() for port in node.out_ports().values() if not port.disconnected()]
            node.infer(node)
            new_out_shapes = [port.data.get_shape() for port in node.out_ports().values() if not port.disconnected()]
            if not node.has_and_set('override_output_shape'):
                for shape1, shape2 in zip(old_out_shapes, new_out_shapes):
                    if shape1 is not None and not np.array_equal(shape1, shape2):
                        raise Error("After partial shape inference were found shape collision for node {} (old shape: "
                                    "{}, new shape: {})".format(node.name, shape1, shape2))
            else:
                del node['override_output_shape']
            node.need_shape_inference = False

to


def shape_inference(graph: Graph):
    for node in graph.pseudo_topological_sort():
        if node.has_and_set('need_shape_inference'):
            old_out_shapes = [port.data.get_shape() for port in node.out_ports().values() if not port.disconnected()]
            node.infer(node)
            new_out_shapes = [port.data.get_shape() for port in node.out_ports().values() if not port.disconnected()]
            for i in range(len(new_out_shapes)):
                # re-apply the old shape when the re-inferred one matches it (a no-op);
                # the collision check below is disabled, so differing shapes no longer raise
                if new_out_shapes[i] is not None and np.array_equal(new_out_shapes[i], old_out_shapes[i]):
                    try:
                        # dict.values() is a view and is not subscriptable, so materialize it first
                        list(node.out_ports().values())[i].data.set_shape(old_out_shapes[i])
                    except Exception:
                        pass

            # original collision check, disabled:
            # if not node.has_and_set('override_output_shape'):
            #     for shape1, shape2 in zip(old_out_shapes, new_out_shapes):
            #         if shape1 is not None and not np.array_equal(shape1, shape2):
            #             raise Error("After partial shape inference were found shape collision for node {} (old shape: "
            #                         "{}, new shape: {})".format(node.name, shape1, shape2))
            # else:
            #     del node['override_output_shape']
            node.need_shape_inference = False

and the IR model was converted successfully; it can be loaded and run by OpenVINO
with correct outputs.
Although this modification makes the conversion work, it is still a temporary solution.
So I think it may help you guys solve this problem soon :) thanks~~
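One detail worth noting about the workaround above: in Python 3, `dict.values()` returns a view that does not support direct indexing, so indexing into `node.out_ports().values()` needs a `list(...)` wrapper first (a bare `except` would otherwise silently swallow the `TypeError`). A quick standalone illustration:

```python
ports = {0: "port_a", 1: "port_b"}  # stand-in for node.out_ports()
values = ports.values()

try:
    values[0]                        # dict_values is not subscriptable
    raise AssertionError("unreachable")
except TypeError:
    pass

first = list(values)[0]              # materialize the view before indexing
print(first)                         # port_a
```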

@ahuizxc
Copy link
Author

ahuizxc commented Dec 16, 2019

@dkurt have you reproduced this issue? thanks

@dkurt
Contributor

dkurt commented Dec 16, 2019

@ahuizxc , can you please specify the OpenVINO version and attach the used .pb file and the MO command line? Thanks!

@ahuizxc
Author

ahuizxc commented Dec 25, 2019

Already solved :) Anyone who encounters this issue, please use the master branch of dldt.

@Pangwei418

Pangwei418 commented Apr 10, 2020

@ahuizxc

The last commit in the dldt master branch is dated Oct 17 2019, which does not seem to match your comment date of Dec 25 2019.
-----------------------------------------------------------------------------
commit 94aed08
Author: Alexey Suhov alexey.suhov@intel.com
Date: Thu Oct 17 17:14:30 2019 +0300

updated readme file due to moving CMake scripts to the root folder

------------------------------------------------------------------------------

So I checked out the 2020.1 branch, which is dated after that, but the issue is still present.
------------------------------------------------------------------------------
commit b2140c0 (HEAD, tag: 2020.1)
Author: Alexey Suhov alexey.suhov@intel.com
Date: Tue Feb 11 22:48:49 2020 +0300

Publishing 2020.1 content

------------------------------------------------------------------------------

Looking into the code in dldt/model-optimizer/mo/middle/passes/eliminate.py, there are no changes relative to your comment.
------------------------------------------------------------------------------
def shape_inference(graph):
    for node in graph.pseudo_topological_sort():
        if node.has_and_set('need_shape_inference'):
            old_out_shapes = [port.data.get_shape() for port in node.out_ports().values() if not port.disconnected()]
            node.infer(node)
            new_out_shapes = [port.data.get_shape() for port in node.out_ports().values() if not port.disconnected()]
            if not node.has_and_set('override_output_shape'):
                for shape1, shape2 in zip(old_out_shapes, new_out_shapes):
                    if shape1 is not None and not np.array_equal(shape1, shape2):
                        raise Error("After partial shape inference were found shape collision for node {} (old shape: "
                                    "{}, new shape: {})".format(node.name, shape1, shape2))
            else:
                del node['override_output_shape']
            node.need_shape_inference = False
------------------------------------------------------------------------------

Any idea?

Sincerely,
Ben.

@ahuizxc
Author

ahuizxc commented Apr 13, 2020


So you encountered the same problem as I did?
Have you tried my solution?
Thanks :)

@Pangwei418

Pangwei418 commented Apr 13, 2020

I am working on another PyTorch model and hit a similar error.
Your change bypasses the error and keeps all the re-inferred output shapes (new_out_shapes). It works, but I really want to know the root cause of the shape inference conflict, not just a workaround for it.

@ahuizxc
Author

ahuizxc commented Apr 14, 2020


I don't know why; maybe only the person who wrote this code knows, since the source code is quite complex. Besides, I once reported this problem to my Russian colleague, and it looks like he is just an external point of contact.

@Pangwei418

I am afraid that the converted model is missing something, so that it no longer reproduces the original model's inference.

@ahuizxc
Author

ahuizxc commented Apr 14, 2020

I am afraid that the converted model is missing something, so that it no longer reproduces the original model's inference.

You can run inference on the same data and check whether there is any difference between the OpenVINO and torch outputs.
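A minimal sketch of that comparison, with placeholder arrays standing in for the real PyTorch and OpenVINO outputs (the shape and tolerance are illustrative):

```python
import numpy as np

# stand-ins for model(x).detach().numpy() and the OpenVINO inference result
out_torch = np.random.rand(1, 32, 240).astype(np.float32)
out_openvino = out_torch.copy()  # replace with the real OpenVINO output

# compare with a tolerance; exact equality rarely holds across runtimes
max_abs_diff = np.max(np.abs(out_torch - out_openvino))
print("max abs diff:", max_abs_diff)  # 0.0 for the identical copy
assert np.allclose(out_torch, out_openvino, atol=1e-4)
```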

@lazarevevgeny
Contributor

@ahuizxc , there were some fixes related to the Reduce operations for TF models. Can you try the Model Optimizer from master and check whether the issue persists?

@lazarevevgeny lazarevevgeny added the question Further information is requested label May 25, 2020
@lazarevevgeny
Contributor

@ahuizxc, can you use the latest MO or share the model?

@AnastasiaKazantaeva
Contributor

It seems that the issue is no longer relevant, as there has been no response. Closing it. Feel free to reopen it or create a new one.

redradist pushed a commit to redradist/openvino that referenced this issue Oct 6, 2023