Replies: 3 comments 1 reply
-
Tracking this in #3. I'm sick at the moment, so this may take a day or two to implement, sorry.
-
Great fix, thanks!
While you're at it, maybe also add OpenVINO? It's yet another CPU EP.
Not urgent; I can probably add a PR if you prefer.
…On Sat, Dec 31, 2022 at 8:37 AM Carson M. ***@***.***> wrote: ort v1.13.2 <https://crates.io/crates/ort/1.13.2> has support for ExecutionProvider::dnnl() among a few other fixes. You'll need to compile ONNX Runtime from source with oneDNN support and point ort to the libraries using the system strategy <https://github.com/pykeio/ort#strategies> (also make sure you enable the onednn Cargo feature).
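To make the steps in that reply concrete, here is a minimal sketch of what enabling the DNNL EP might look like. Only `ExecutionProvider::dnnl()` and the `onednn` feature are confirmed by the thread; the builder names (`Environment`, `SessionBuilder`, `with_execution_providers`, `with_model_from_file`), the env-var names, and `model.onnx` are assumptions based on the ort 1.x API and may differ in your version.

```rust
// Prerequisites, per the reply above (paths and flags are assumptions):
//   1. Build ONNX Runtime from source with oneDNN support, e.g.:
//        ./build.sh --config Release --use_dnnl --build_shared_lib --parallel
//   2. Point ort at those libraries via the "system" strategy:
//        export ORT_STRATEGY=system
//        export ORT_LIB_LOCATION=/path/to/onnxruntime/build/Linux/Release
//   3. Enable the Cargo feature in Cargo.toml:
//        ort = { version = "1.13", features = ["onednn"] }

use ort::{Environment, ExecutionProvider, SessionBuilder};

fn main() -> ort::OrtResult<()> {
    let environment = Environment::builder()
        .with_name("dnnl-demo")
        .build()?
        .into_arc();

    let _session = SessionBuilder::new(&environment)?
        // Request the DNNL (oneDNN) EP; ort should fall back to the
        // default MLAS CPU EP if DNNL isn't available in the loaded library.
        .with_execution_providers([ExecutionProvider::dnnl()])?
        .with_model_from_file("model.onnx")?;

    Ok(())
}
```

Note that registering an EP is a request, not a guarantee: if the linked ONNX Runtime wasn't built with oneDNN, the session will silently run on the default CPU EP, which is consistent with the Mac observation below.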
-
Hi, first of all, great crate! Much cleaner and more comprehensive.
I'm ready to use it in one of my critical projects.
One thing though: we're serving on CPU, and our CPUs would benefit greatly from the MKL library.
ONNX Runtime has DnnlExecutionProvider, which would enable the MKL support, but your crate doesn't seem to provide it.
I also checked the downloaded ONNX Runtime library on my Mac and don't see the DNNL lib, so it looks like it's just using the normal MLAS CPU EP.
Is it possible to add this option?
Thanks so much for the work!