'Wrong wire type in tag' error when trying to convert .pb to ONNX #1038
This looks like saved-model format, so you can:
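A minimal sketch of what a saved-model conversion invocation might look like, assuming the standard tf2onnx CLI; the paths here are placeholders, not from this issue.

```python
# Sketch of a saved-model conversion with the tf2onnx CLI.
# The model directory and output path are placeholders.
import subprocess

cmd = [
    "python", "-m", "tf2onnx.convert",
    "--saved-model", "path/to/saved_model_dir",  # directory, not the .pb file itself
    "--output", "model.onnx",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the conversion
```

Note that --saved-model expects the directory containing saved_model.pb, not the file path.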
That loads, but we throw an error while processing a loop:
We'll take a look.
Hi @guschmue, I couldn't reproduce your error. I tried this code with Python 3.8.5, tensorflow 2.3.0. Error: 'ValueError: Failed to import metagraph, check error log for more info.'
2nd scenario: Python 3.7.7, tensorflow 1.13.2. Error: 'KeyError: 'FusedBatchNormV3''
I guess Google AutoML Vision exported the model using the latest TF version (2.3 as of today).
FusedBatchNormV3 was introduced after tf-1.13, so tf-1.13 can't load this model. tf-1.14 or tf-1.15 should work.
but hit this issue:
I'm getting a different error, am I missing a library?
I had a CPU build installed. If I use a GPU build I see the same error that you see. Odd.
Do you mean you have a GPU build installed? I do not have a GPU...
I see... how can we solve this? Any workaround?
Hi, any update on this?
I am facing the exact same problem too.
@csl-8bit looking at this now. Which of the above errors are you currently getting?
@csl-8bit can you upload a model you are trying to convert? I'm unable to open the saved model at the top of this issue.
I trained the object detection model using Google AutoML. I tried this code:
Interesting. This definitely is in saved model format, but it has no meta graph. Also it uses the DecodeJpeg op, which we don't support yet.
@csl-8bit Can you upload the tflite version as well?
I think the name is saved_model but it's actually a graphdef... use:
I thought that too at first, but it is actually a saved model according to Netron.
Yes, I get a decode error. Weird.
I have uploaded both the tflite and tfjs versions.
Hi @TomWildenhain-Microsoft, any update on the DecodeJpeg op yet?
Yes, while DecodeJpeg isn't supported, the tflite version of the model accepts a tensor of pixel values directly and looks easy to convert. We plan on adding support for converting tflite models next semester. When we do, your model should be convertible.
The way you can work around it is to pass in an explicit input that comes after the DecodeJpeg op.
@guschmue Do we have the code that does that for saved models merged?
I can send a PR today.
@guschmue thanks, can you please post here when the PR is merged? I really want to be able to use the workaround you mentioned.
Hey, have you got anything yet?
I spoke too early: overriding the inputs creates some issues. Thinking about how to get around this.
You should now be able to convert the tflite model using the --tflite arg. You will need opset 13 and the latest ORT nightly build.
Thanks @TomWildenhain-Microsoft . I'm getting an error trying to run this:
Should I use another GitHub branch? For the saved-model argument, should I use the folder containing the saved_model.pb file or the file path itself? I guess I would have to use the model.tflite file path, am I right? Could you please post the command to be used? Thanks
Use: python -m tf2onnx.convert --opset 13 --tflite "path/to/model.tflite" --output "converted.onnx". This feature is in beta, so let me know if it works.
I'm still getting the error:
What can I do?
Trying another tensorflow version might be useful to you. It worked in my case; I used tensorflow 2.5.0.
@theumang100 how did you install it? I'm trying this:
Try pip install tf-nightly==2.5.0.dev20210203 or pip install tensorflow==2.4.1.
You want the latest version of tf2onnx, I think:
Thanks, uninstalling tensorflow and running the suggested install worked.
Does anyone know how to use this model on live video to recognize objects? I would like to send the object's coordinates to a microcontroller using bluetooth or a serial port.
To use the model on video, you should be able to:
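Not part of the original replies, but a rough sketch of running a converted detector on webcam frames and extracting box coordinates to forward to a microcontroller. The onnxruntime/OpenCV usage, the model path, and the [ymin, xmin, ymax, xmax] output layout are all assumptions; adjust them to your converted model.

```python
# Sketch: run a converted ONNX detector on webcam frames and extract
# pixel box coordinates (e.g. to write to a serial port afterwards).
# Model path, input layout, and output layout are assumptions.

def boxes_to_pixels(boxes, width, height):
    """Convert normalized [ymin, xmin, ymax, xmax] boxes to pixel coords."""
    pixel = []
    for ymin, xmin, ymax, xmax in boxes:
        pixel.append((int(xmin * width), int(ymin * height),
                      int(xmax * width), int(ymax * height)))
    return pixel

def run_on_video(model_path="converted.onnx", camera_index=0):
    # Optional dependencies: pip install onnxruntime opencv-python
    import cv2                  # frame capture
    import numpy as np
    import onnxruntime as ort   # ONNX inference

    sess = ort.InferenceSession(model_path)
    input_name = sess.get_inputs()[0].name
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        h, w = frame.shape[:2]
        # Many AutoML detectors take a uint8 NHWC tensor; resize if needed.
        tensor = np.expand_dims(frame, axis=0)
        outputs = sess.run(None, {input_name: tensor})
        boxes = outputs[0][0]   # assumed layout: normalized detection boxes
        for box in boxes_to_pixels(boxes, w, h):
            print(box)          # replace with a serial/bluetooth write
    cap.release()
```

The pure helper boxes_to_pixels does the coordinate conversion you would send over the serial port; the capture loop itself is only a template.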
tf2onnx now also supports "graph cutting". If you specify inputs that are nodes in the middle of the graph, tf2onnx will promote them to inputs and remove the nodes before them. |
@TomWildenhain-Microsoft can you please provide an example?
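A hypothetical graph-cutting invocation: name a tensor that sits just after DecodeJpeg as the model input, and tf2onnx will promote it and drop the nodes before it. The tensor name "convert_image/Cast:0" is made up for illustration; inspect the real graph (e.g. in Netron) to find yours.

```python
# Hypothetical graph-cutting example with the tf2onnx CLI.
# "convert_image/Cast:0" is a made-up tensor name; find the real one
# by inspecting the graph in Netron. Paths are placeholders.
import subprocess

cmd = [
    "python", "-m", "tf2onnx.convert",
    "--saved-model", "path/to/saved_model_dir",
    "--opset", "13",
    "--inputs", "convert_image/Cast:0",  # mid-graph node promoted to input
    "--output", "cut_model.onnx",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to perform the conversion
```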
You will need to install the latest tf2onnx from master. |
@dfvr1994 did this work for you?
@TomWildenhain-Microsoft, can I specify the output opset? Matlab is not able to read opset 13.
Sure, use:
@TomWildenhain-Microsoft thank you so much, dequantizing the model and setting the opset to 9 seems to work.
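For reference, a sketch of the flag combination implied above (lower opset plus dequantization); the flag spellings assume the current tf2onnx CLI and the paths are placeholders.

```python
# Sketch: tflite conversion with a lower opset and dequantization,
# matching what reportedly worked above. Paths are placeholders.
import subprocess

cmd = [
    "python", "-m", "tf2onnx.convert",
    "--tflite", "path/to/model.tflite",
    "--opset", "9",        # Matlab could not read opset 13
    "--dequantize",        # strip quantization from the tflite model
    "--output", "model_opset9.onnx",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to run the conversion
```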
Quick question: Matlab doesn't seem to like certain layers. When converting the model, can I set which layers to exclude or replace with others?
Hello, has anyone figured out how to convert the saved model (.pb, AutoML version) to onnx yet? I still run into this issue.
@tnaduc are you able to get a tflite version of the model?
@TomWildenhain-Microsoft No, I haven't tried tflite, I only tried the saved model (.pb file).
@tnaduc If you have access to a tflite version of the model, I would recommend trying it. Let me know if it doesn't work.
@TomWildenhain-Microsoft thanks Tom, I will follow your recommendation and let you know.
Describe the bug
I'm using the following command to convert the frozen pb model to ONNX, with no success:
python -m tf2onnx.convert --graphdef saved_model.pb --output frozen.onnx --fold_const --opset 10 --inputs image_tensor:0 --outputs num_detections:0,detection_boxes:0,detection_scores:0,detection_classes:0
System information
Model attached
saved_model.zip
I do not have the chance to obtain the model in the saved model format. I created it using Google's AutoML for object detection, which lets me output the model in .tflite (including dict.txt and json metadata), .pb (alone), and tensorflow.js (3 bins, dict.txt and model.json) formats.
Error