When I run the quantized version of kokoro using the Python API, an error occurred. #1923
Please follow the instructions to update onnxruntime in sherpa-onnx. Currently, we use onnxruntime 1.17.1.
When I update onnxruntime, the error still exists.
Which version of onnxruntime are you using? Please also describe how you updated it.
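A quick way to confirm which onnxruntime is actually installed (a sketch using only the Python standard library, so it works even when importing the package itself fails):

```python
# Report the installed version of a package via importlib.metadata,
# returning None instead of raising when the package is absent.
from importlib import metadata
from typing import Optional


def installed_version(package: str) -> Optional[str]:
    """Return the installed version string of `package`, or None if missing."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None


if __name__ == "__main__":
    print("onnxruntime:", installed_version("onnxruntime"))
```

Note that the version of the `onnxruntime` Python package is separate from the onnxruntime that sherpa-onnx was compiled against; updating the pip package alone does not change the runtime linked into sherpa-onnx.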
I hope you followed my first comment to update onnxruntime in sherpa-onnx.
The quantized version of kokoro is here: https://huggingface.co/onnx-community/Kokoro-82M-v1.0-ONNX/blob/main/onnx/model_quantized.onnx
The sherpa-onnx version is as follows:
sherpa-onnx 1.10.45
I use this command:
It causes the following error:
2025-02-26 14:00:50,567 INFO [offline-tts-play.py:513] Loading model ...
Traceback (most recent call last):
  File "/home/lyg/Codes/sherpa-onnx/python-api-examples/offline-tts-play.py", line 576, in <module>
    main()
  File "/home/lyg/Codes/sherpa-onnx/python-api-examples/offline-tts-play.py", line 514, in main
    tts = sherpa_onnx.OfflineTts(tts_config)
RuntimeError: Failed to load model with error: /shared/onnxruntime/core/graph/model_load_utils.h:46 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::basic_string<char>, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only *guarantees* support for models stamped with official released onnx opset versions. Opset 5 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 4.
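The RuntimeError comes from onnxruntime's `ValidateOpsetForDomain` check at model-load time: the quantized model is stamped with `ai.onnx.ml` opset 5, but this onnxruntime build only supports that domain up to opset 4. The logic can be sketched as follows (a hypothetical helper for illustration, not onnxruntime's actual API; the `ai.onnx.ml` limit of 4 is taken from the error message, the other numbers are illustrative):

```python
# Sketch of the per-domain opset check behind the error: compare the opsets
# a model declares against the maxima a given runtime build supports.

# Assumed (illustrative) maxima; the ai.onnx.ml limit of 4 comes from the
# RuntimeError text above ("support ... is till opset 4").
SUPPORTED_MAX_OPSET = {
    "": 20,            # default ai.onnx domain (illustrative value)
    "ai.onnx.ml": 4,   # per the error message
}


def unsupported_opsets(model_opsets, supported=SUPPORTED_MAX_OPSET):
    """Return a message for every domain whose declared opset exceeds
    what the runtime supports."""
    problems = []
    for domain, version in model_opsets.items():
        limit = supported.get(domain)
        if limit is not None and version > limit:
            problems.append(
                f"domain {domain or 'ai.onnx'!r}: model declares opset "
                f"{version}, runtime supports up to {limit}"
            )
    return problems


# The quantized Kokoro model stamps ai.onnx.ml with opset 5, so the
# check reports a mismatch:
print(unsupported_opsets({"": 14, "ai.onnx.ml": 5}))
```

In practice the fix is what the maintainer suggests: rebuild sherpa-onnx against a newer onnxruntime that supports `ai.onnx.ml` opset 5, or re-export the model with a lower opset.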
Thank you!