[Feature Request]: Support ONNX runtime in RunInference API #22972
Comments
cc: @damccorm
.take-issue
A first-timer question: I see the integration tests for the Python SDK pull data from a gs location. How can I read/write to that location? Do I need special permission?
I think so. To access
Thanks - I wrote the API and local unit tests for it. Could you help me review it before I proceed with tests that require remote files? Should I make a draft PR? Thanks.
Yes, that sounds good. Please refer to this issue in the PR and tag me as a reviewer.
It seems I cannot add reviewers to draft PRs. Here is the PR: #24911
I'm closing this since it seems to have been completed; however, the ONNX functionality currently appears untested and potentially broken. Let's follow up in #31254.
What would you like to happen?
ONNX Runtime is one of the most widely used frameworks for ML inference in production.
https://onnxruntime.ai/
If the RunInference API supported ONNX Runtime, it would effectively cover a wide range of ML frameworks: not only PyTorch, scikit-learn and TensorFlow, but also Transformers (Hugging Face), XGBoost and so on.
https://onnx.ai/supported-tools.html
Issue Priority
Priority: 2
Issue Component
Component: sdk-py-core