Hello, the Chinese ONNX model inference seems not right. #9
@csukuangfj @EmreOzkose Hello, can you help me out with this issue?
@EmreOzkose
@jinfagang Could you fix use (Note: You need to make some changes to
@csukuangfj Yes, the repo can decode English samples, but I didn't test it with Chinese. @jinfagang Sorry for the late reply, I was mostly AFK for a few days. Can you check the models with onnx_check.py?
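For context on what a check script like onnx_check.py verifies: it runs the same input through the PyTorch model and the exported ONNX model and asserts the outputs match within a tolerance. Here is a minimal pure-Python sketch of that kind of tolerance comparison (the function, tolerances, and values are illustrative, not the actual script):

```python
def allclose(a, b, atol=1e-5, rtol=1e-3):
    """Element-wise tolerance check, similar in spirit to the
    assertion an onnx_check-style script makes when comparing
    PyTorch outputs against ONNX Runtime outputs."""
    if len(a) != len(b):
        return False
    return all(abs(x - y) <= atol + rtol * abs(y) for x, y in zip(a, b))

# Hypothetical outputs from the two backends for the same input:
torch_out = [0.1234567, -2.5, 3.0]
onnx_out = [0.1234570, -2.5, 3.0]

assert allclose(torch_out, onnx_out)       # tiny float drift is OK
assert not allclose(torch_out, [1.0, -2.5, 3.0])  # a real mismatch fails
```

If this assertion fails for the Chinese model but passes for English, the export itself (not the decoding code) is the likely culprit.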
Thank you all. I will try it.
@EmreOzkose @csukuangfj Hello. I have tried `onnx_pretrained.py` and got an error:
Can you make it more consistent? The joiner in `onnx_pretrained.py` has no projector, while the C++ side needs a projector. The code does not give me an end-to-end inference result. What have I missed here? (I just want to get the final ASR result.)
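To illustrate the projector mismatch described above: a transducer joiner can either apply the encoder/decoder projection layers itself, or expect inputs that are already projected. A pure-Python sketch of the two conventions (all names, shapes, and weights here are hypothetical toys, not the actual icefall/sherpa code):

```python
import math

def matvec(w, x):
    """Multiply matrix w (a list of rows) by vector x."""
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def joiner_with_projector(enc, dec, enc_proj, dec_proj, out_w):
    """Joiner that projects encoder/decoder outputs itself
    (the convention the C++ side appears to expect)."""
    e = matvec(enc_proj, enc)
    d = matvec(dec_proj, dec)
    h = [math.tanh(a + b) for a, b in zip(e, d)]
    return matvec(out_w, h)

def joiner_without_projector(enc_projected, dec_projected, out_w):
    """Joiner that assumes its inputs are already projected
    (the convention of a joiner exported without projections)."""
    h = [math.tanh(a + b) for a, b in zip(enc_projected, dec_projected)]
    return matvec(out_w, h)

# Toy dimensions: encoder dim 3, decoder dim 2, joiner dim 2, vocab 2.
enc = [1.0, 0.5, -0.5]
dec = [0.2, -0.1]
enc_proj = [[0.1, 0.2, 0.3], [0.0, -0.1, 0.2]]
dec_proj = [[0.5, 0.1], [-0.2, 0.4]]
out_w = [[1.0, 0.0], [0.5, -0.5]]

full = joiner_with_projector(enc, dec, enc_proj, dec_proj, out_w)
pre = joiner_without_projector(matvec(enc_proj, enc), matvec(dec_proj, dec), out_w)
assert full == pre  # identical once the projections are applied up front
```

The two are equivalent only if the caller applies the projections before the projector-less joiner, which is why mixing the conventions between the Python export and the C++ runtime silently breaks inference.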
I got an assertion error when trying onnx_check.py:
How did you get the onnx models? Are you using the same branch to get the onnx models and
@jinfagang I will debug with the Chinese model today.
@EmreOzkose Thank you! I exported the onnx model with the same export script, like this:
@EmreOzkose Hello, is the Chinese model able to get a correct result?
Please use the latest master. |
Hi. I exported a WeNet model with this script:
These weights give a correct result with PyTorch inference,
but when I run inference with onnx, I get this result:
I checked the tokens and they seem normal.
What could be missing?
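Since the tokens file looks normal, one quick sanity check (independent of the export script, which isn't shown here) is to map the hypothesis token IDs back to text by hand and compare against the PyTorch result. A sketch with made-up tokens.txt contents and IDs, purely for illustration:

```python
# Hypothetical tokens.txt contents: one "<symbol> <id>" pair per line.
tokens_txt = """<blk> 0
好 1
你 2
世 3
界 4"""

# Build id -> symbol, splitting from the right so symbols with
# spaces would still parse correctly.
id2token = {}
for line in tokens_txt.splitlines():
    sym, idx = line.rsplit(maxsplit=1)
    id2token[int(idx)] = sym

# Hypothetical decoder output with blanks already removed.
hyp_ids = [2, 1, 3, 4]
text = "".join(id2token[i] for i in hyp_ids)
assert text == "你好世界"
```

If the IDs coming out of the ONNX model decode to garbage while the same IDs from PyTorch decode correctly, the divergence is inside the exported model rather than in the token table.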