Wrong shape inference when converting TF model #335
Comments
@dkurt have you reproduced this issue? Thanks
@ahuizxc, can you please specify the OpenVINO version and attach the model used?
Already solved :) Anyone who encounters this issue, please use the master branch of dldt.
The most recent commit on the dldt master branch is dated Oct 17, 2019, which does not match your comment date of Dec 25, 2019.
------------------------------------------------------------------------------ So I checked out the 2020.1 branch and confirmed its date is later than that, but the issue is still present.
------------------------------------------------------------------------------ Looking at the code in dldt/model-optimizer/mo/middle/passes/eliminate.py, I see no changes matching your comment. Any ideas? Sincerely,
So you encountered the same problem as I did?
I am working on another PyTorch model and get a similar error.
I don't know why; maybe only the person who wrote this code knows the root cause, since the source code is so complex. Besides, I previously reported this problem to my Russian colleague, and it looks like he is just an external communication interface.
I fear that the converted model is missing something and no longer matches the original model's inference results.
You can run inference on the same data and check whether there is any difference between the OpenVINO and Torch outputs.
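A minimal sketch of such a comparison, assuming you have already run both models on the same input and collected the outputs as arrays (`compare_outputs` and the tolerance are illustrative helpers, not part of either framework's API):

```python
import numpy as np

def compare_outputs(torch_out, openvino_out):
    """Return the max absolute difference between two outputs
    produced from the same input by the two frameworks."""
    a = np.asarray(torch_out, dtype=np.float32)
    b = np.asarray(openvino_out, dtype=np.float32)
    assert a.shape == b.shape, f"shape mismatch: {a.shape} vs {b.shape}"
    return float(np.max(np.abs(a - b)))

# Hypothetical usage with two precomputed result arrays:
# diff = compare_outputs(torch_result, ov_result)
# print("max abs diff:", diff)
```

If the max difference is large (well beyond float rounding), the conversion likely lost or reordered something.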
@ahuizxc, there were some fixes related to the Reduce operations in TF. Can you try the Model Optimizer from master and check if the issue persists?
@ahuizxc, can you use the latest MO or share the model? |
It seems that the issue is no longer relevant, as there has been no response. Closing it. Feel free to reopen it or create a new one.
Hi, there is a bug when I try to convert a TensorFlow model to an IR model.
The TensorFlow inference graph code is:
and the descriptor's shape is
So when I try to convert the model file to an IR model, I get an error:
[ ERROR ] After partial shape inference were found shape collision for node pred/global_head/vlad/Sum (old shape: [ 1 32 240], new shape: [ 1 240 32])
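For context, the two shapes in the error message are permutations of each other (the last two axes are swapped), which is consistent with an axis-order/layout mix-up during shape inference of the `Sum` (Reduce) node. A small numpy illustration of the relationship (the shapes come from the error message; the layout explanation is my assumption, not confirmed from the MO source):

```python
import numpy as np

# The two colliding shapes reported by the Model Optimizer:
old_shape = (1, 32, 240)
new_shape = (1, 240, 32)

x = np.zeros(old_shape)

# Swapping the last two axes turns one shape into the other:
assert x.transpose(0, 2, 1).shape == new_shape

# A Sum over a fixed axis index then yields different output shapes
# depending on which layout the inference pass actually sees:
print(np.sum(x, axis=1).shape)                     # reduces the 32-axis
print(np.sum(x.transpose(0, 2, 1), axis=1).shape)  # reduces the 240-axis
```

This is why a reduce node is exactly where such a collision would surface: the reduced axis index only makes sense in one of the two layouts.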
So I modified the code in
https://github.com/opencv/dldt/blob/fe3f978b98c86eaeed3cbdc280e1ffd0bc50d278/model-optimizer/mo/middle/passes/eliminate.py#L155
to
and the IR model was converted successfully; it can be loaded and run by OpenVINO with correct outputs.
Although this modification makes the conversion work, it is still only a temporary workaround.
So I hope this helps you solve the problem soon :) Thanks~~