
Have you ever tried https://tfhub.dev/google/tfjs-model/movenet/multipose/lightning/1 ? #4

Closed
liamsun2019 opened this issue Dec 17, 2021 · 4 comments


@liamsun2019

Hi, have you ever tried the model
https://tfhub.dev/google/tfjs-model/movenet/multipose/lightning/1

which is a model in TFJS format? I converted it to an ONNX model using:
python3.6 -m tf2onnx.convert --opset 11 --tfjs model.json --output test.onnx --inputs-as-nchw input

where model.json is one of the files extracted from tfjs-model_movenet_multipose_lightning_1.tar.gz.

Then I applied this ONNX model to multi-pose detection, but the inference results are very poor. I compared the results with those of https://tfhub.dev/google/movenet/multipose/lightning/1 and they are quite different, but I cannot find the cause. Any suggestions? Thanks.
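For reference, here is a minimal inference sketch for the converted model, assuming ONNX Runtime, an input tensor named "input", a 256x256 int32 input, and a hypothetical test image; the actual input name, shape, and dtype should be read from the model itself, as shown:

```python
# Minimal sketch (assumptions: onnxruntime, input named "input", 256x256 int32).
import cv2
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("test.onnx")
inp = sess.get_inputs()[0]
print(inp.name, inp.shape, inp.type)        # verify layout and dtype first

img = cv2.imread("sample.jpg")              # hypothetical test image
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (256, 256))           # multipose expects dims divisible by 32
x = img[np.newaxis, ...].astype(np.int32)   # NHWC: (1, 256, 256, 3)
x = x.transpose(0, 3, 1, 2)                 # NCHW, to match --inputs-as-nchw
outputs = sess.run(None, {inp.name: x})
print(outputs[0].shape)                     # expected (1, 6, 56): 6 people x 56 values
```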

@liamsun2019
Author

In addition, I also converted the TF version to ONNX:
python3.6 -m tf2onnx.convert --saved-model movenet_multipose_lightning_1 --output model.onnx --opset=11 --inputs-as-nchw input

where movenet_multipose_lightning_1 is the directory into which movenet_multipose_lightning_1.tar.gz was extracted, containing the SavedModel .pb file.

This ONNX model also performs badly, which looks very strange.

@Kazuhito00
Owner

Kazuhito00 commented Dec 20, 2021

This repository does not support tfjs.

Also, even when I convert the TF file to ONNX (NCHW), the accuracy does not seem to deteriorate.
You should check that your inference process is error-free.

[Comparison image: blue = TF, red = ONNX (NCHW)]
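One way to make that check concrete is to feed the SavedModel and the ONNX model the same tensor and compare outputs numerically. A sketch, assuming the file paths from the commands above:

```python
# Sanity-check sketch: the two models should agree to within float tolerance.
import numpy as np
import onnxruntime as ort
import tensorflow as tf

x_nhwc = np.random.randint(0, 256, (1, 256, 256, 3), dtype=np.int32)

# TensorFlow SavedModel, fed NHWC as usual.
model = tf.saved_model.load("movenet_multipose_lightning_1")
infer = model.signatures["serving_default"]
in_name = list(infer.structured_input_signature[1].keys())[0]
tf_out = list(infer(**{in_name: tf.constant(x_nhwc)}).values())[0].numpy()

# ONNX model converted with --inputs-as-nchw, so transpose the same data.
sess = ort.InferenceSession("model.onnx")
x_nchw = x_nhwc.transpose(0, 3, 1, 2)
onnx_out = sess.run(None, {sess.get_inputs()[0].name: x_nchw})[0]

print(np.abs(tf_out - onnx_out).max())      # should be close to zero
```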

@liamsun2019
Author

Thanks for your feedback. In fact, when I convert the TFJS model to ONNX using tf2onnx.convert with the '--inputs-as-nchw' option, the generated ONNX model performs badly (I adjust the input shape to NCHW accordingly in my inference code). In contrast, if '--inputs-as-nchw' is not applied, i.e., the input remains NHWC, the ONNX model performs well (I feed the model NHWC-shaped input accordingly). I am curious why this happens.
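For what it's worth, one common cause of exactly this symptom (just an assumption about what might be happening here): '--inputs-as-nchw' only relabels the graph input, so the calling code must transpose the data, not merely reshape it to the new shape. A reshape keeps the bytes in NHWC order under an NCHW label and produces garbage predictions:

```python
# reshape() vs transpose(): only transpose actually moves the channel axis.
import numpy as np

x_nhwc = np.random.randint(0, 256, (1, 256, 256, 3), dtype=np.int32)

wrong = x_nhwc.reshape(1, 3, 256, 256)      # NCHW shape, but NHWC byte order
right = x_nhwc.transpose(0, 3, 1, 2)        # genuine NHWC -> NCHW conversion

print(np.array_equal(wrong, right))         # False: different tensors
```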

@Kazuhito00
Owner

This repository does not support tfjs.
