Paraformer model cannot use the CoreML provider #902
Comments
Do other models work? Have you tried the int8.onnx model? Are you on macOS?
I am on macOS.
I'm afraid I can't solve this problem, sorry.
Thanks for the quick reply! It's probably an issue in onnx runtime's CoreML provider. I found another project that uses onnx runtime to run inference with the Paraformer model, and it hits the same problem.
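To narrow the failure down to the CoreML execution provider, one way is to request CoreML explicitly while keeping CPU as a fallback. A minimal sketch, assuming onnxruntime's Python API; the model filename is a placeholder, not a path from this issue:

```python
# Hypothetical sketch: build a preference-ordered onnxruntime provider list
# that tries CoreML first but always keeps CPU as the fallback.

def select_providers(available):
    """Given onnxruntime's available providers, prefer CoreML, fall back to CPU."""
    preferred = ["CoreMLExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]

# Usage (CoreML is only available in macOS builds of onnxruntime):
#   import onnxruntime as ort
#   providers = select_providers(ort.get_available_providers())
#   session = ort.InferenceSession("model.int8.onnx", providers=providers)
```

If the session works with `["CPUExecutionProvider"]` alone but fails once `CoreMLExecutionProvider` is in the list, that points at the provider rather than the model export.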
Could you try exporting the model yourself?
I'll give it a try. Is there an export script I can reference? It looks like the original PyTorch model needs to be split into an encoder and a decoder, which are then exported separately.
Reproduce: