
Run ML model on GPU of an Android run system #6694

Closed
besartgrabanica opened this issue Feb 15, 2021 · 3 comments
Labels
platform:mobile issues related to ONNX Runtime mobile; typically submitted using template

Comments

besartgrabanica commented Feb 15, 2021

Hello,

I would like to know whether ONNX Runtime has GPU support on Android systems. In my particular case, I need to run inference for a PyTorch-based deep learning model that uses transformers on the GPU of an Android device.

Thanks in advance!

guoyu-wang (Contributor) commented Feb 27, 2021

We have the NNAPI execution provider, which uses the CPU/GPU/NPU for model execution on Android.
Please see the NNAPI Execution Provider documentation.

@sophies927 sophies927 added platform:mobile issues related to ONNX Runtime mobile; typically submitted using template and removed ep:NNAPI labels Aug 12, 2022
deephudka005 commented Nov 16, 2022

Can you provide some example code for running inference with an ORT model on the GPU on Android? I tried creating session options and adding NNAPI to them, but I am not sure whether my approach is correct:

val options = OrtSession.SessionOptions()
options.addNnapi()  // register the NNAPI EP before creating the session
ortSession = ortEnv?.createSession(onnxModel, options)
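For reference, a fuller sketch along those lines, assuming the ai.onnxruntime Android package; the NNAPIFlags names come from the Java API, and the helper function name is hypothetical:

```kotlin
// Sketch: create an ONNX Runtime session on Android with the NNAPI EP.
// Assumes the onnxruntime-android dependency; model bytes loaded by the caller.
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession
import ai.onnxruntime.providers.NNAPIFlags
import java.util.EnumSet

fun createNnapiSession(modelBytes: ByteArray): OrtSession {
    val env = OrtEnvironment.getEnvironment()
    val options = OrtSession.SessionOptions()
    // Register the NNAPI EP. Flags are optional; USE_FP16 allows fp16
    // relaxation, which can speed up GPU/NPU execution at some accuracy cost.
    options.addNnapi(EnumSet.of(NNAPIFlags.USE_FP16))
    return env.createSession(modelBytes, options)
}
```

Operators that the NNAPI EP cannot handle fall back to the default CPU EP, so the session still runs even when NNAPI covers only part of the graph.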

@guoyu-wang guoyu-wang removed their assignment Dec 12, 2022
Uwenisme (Contributor) commented:

Do you mean we do not need to select a specific device (CPU, GPU, or NPU), and the NNAPI EP can use all of them automatically?
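As far as I understand, NNAPI itself decides which device runs each operation; the ORT session options only pass coarse hints. A hedged sketch of excluding NNAPI's own CPU implementation so that eligible ops are steered toward GPU/NPU where available (the flag name is from the Java API; the helper function is hypothetical):

```kotlin
// Sketch: constrain the NNAPI EP away from NNAPI's CPU reference
// implementation. Device selection beyond that stays with the NNAPI runtime.
import ai.onnxruntime.OrtSession
import ai.onnxruntime.providers.NNAPIFlags
import java.util.EnumSet

fun nnapiAcceleratorOnlyOptions(): OrtSession.SessionOptions {
    val options = OrtSession.SessionOptions()
    // CPU_DISABLED asks NNAPI not to use its own CPU implementation;
    // ops NNAPI cannot place on an accelerator fall back to ORT's CPU EP.
    options.addNnapi(EnumSet.of(NNAPIFlags.CPU_DISABLED))
    return options
}
```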


7 participants