
Inference on GPU #25

Open
drvshavva opened this issue Nov 10, 2021 · 4 comments

Comments

@drvshavva

Hi, can I build and use libonnx on GPU?

@jianjunjiang
Member

You need to write a struct onnx_resolver_t, and then pass it to the onnx_context_alloc_from_file function.

@drvshavva
Author

Thanks for the answer :)

@drvshavva drvshavva reopened this Nov 17, 2021
@drvshavva
Author

Hi again. You are saying that we need to write a struct onnx_resolver_t to run inference on the GPU. Is that enough, or is anything else needed? And finally, how do I write this struct onnx_resolver_t — what structure should it have, and what do I need to do? Many thanks in advance for any replies :)

@jianjunjiang
Member

Writing a resolver requires a lot of work; you can refer to the default resolver implementation as a reference.


2 participants