Hello,
I would like to know whether ONNX Runtime has GPU support on Android. In my particular case, I need to run inference of a PyTorch-based deep learning model that uses transformers on the GPU of an Android device.
Thanks in advance!
Can you provide some example code for running inference of an ORT model on the GPU on Android? I tried using session options and adding NNAPI, but I am not sure whether my approach is correct.
val options = OrtSession.SessionOptions()
options.addNnapi()
ortSession = ortEnv?.createSession(onnxModel, options)
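For reference, below is a minimal sketch of how the NNAPI execution provider can be registered through the Kotlin/Java API (OrtSession.SessionOptions.addNnapi is part of the published API). The model bytes, input name, and the 1x3x224x224 input shape are placeholder assumptions for illustration only. Note that NNAPI dispatches to whatever accelerators the device's drivers expose (GPU, NPU, or DSP), and any unsupported operators fall back to the CPU execution provider.

import ai.onnxruntime.OnnxTensor
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession
import ai.onnxruntime.providers.NNAPIFlags
import java.nio.FloatBuffer
import java.util.EnumSet

fun runWithNnapi(modelBytes: ByteArray): Any? {
    val env = OrtEnvironment.getEnvironment()
    val options = OrtSession.SessionOptions()
    // Register the NNAPI EP; USE_FP16 is optional and only helps if the device supports fp16.
    options.addNnapi(EnumSet.of(NNAPIFlags.USE_FP16))

    val session = env.createSession(modelBytes, options)

    // Placeholder input: name taken from the model, shape/data assumed for illustration.
    val inputName = session.inputNames.first()
    val shape = longArrayOf(1, 3, 224, 224)
    val inputData = FloatBuffer.allocate(1 * 3 * 224 * 224)

    return OnnxTensor.createTensor(env, inputData, shape).use { tensor ->
        session.run(mapOf(inputName to tensor)).use { results ->
            results[0].value  // first output of the model
        }
    }
}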