Is it possible to run inference on CPU? #18
Hi @justmaulik

I already tried it on Colab and modified some of the code (a few lines) where it required CUDA, but it's more complex than just a few lines. I can train it on a GPU, but I'm planning to host inference as an API, so CPU is the only option that fits my budget. Is replacing ninja to support CPU inference anywhere on the to-do list? :) Or could you point me in the direction of what I need to do to run just the inference on CPU?
If I had to guess, you're probably getting an exception when trying to load the FusedLeakyReLU?
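
If it is the FusedLeakyReLU op, a common workaround is to swap the compiled CUDA extension for a pure-PyTorch equivalent. The sketch below follows the native fallback shipped in rosinality's stylegan2-pytorch, which repos like this one typically vendor in an `op/fused_act.py` module; the exact file path here is an assumption, so adapt it to wherever `FusedLeakyReLU` is defined in this repo.

```python
import torch
import torch.nn.functional as F
from torch import nn


def fused_leaky_relu(input, bias, negative_slope=0.2, scale=2 ** 0.5):
    # Pure-PyTorch stand-in for the fused CUDA kernel:
    # add the per-channel bias, apply leaky ReLU, then rescale.
    rest_dim = [1] * (input.ndim - bias.ndim - 1)
    return (
        F.leaky_relu(
            input + bias.view(1, bias.shape[0], *rest_dim),
            negative_slope=negative_slope,
        )
        * scale
    )


class FusedLeakyReLU(nn.Module):
    def __init__(self, channel, negative_slope=0.2, scale=2 ** 0.5):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(channel))
        self.negative_slope = negative_slope
        self.scale = scale

    def forward(self, input):
        return fused_leaky_relu(input, self.bias, self.negative_slope, self.scale)
```

Note that the sibling `upfirdn2d` op usually needs the same treatment; rosinality's repo ships a native fallback for it as well.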
Thanks @yuval-alaluf. I tried to make it work (with beginner knowledge), but I keep getting different errors.
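
One error source worth ruling out first: checkpoints saved on a GPU fail to deserialize on a CPU-only machine unless `torch.load` is given `map_location`. A minimal sketch, where the checkpoint path and the tiny `nn.Sequential` stand-in network are hypothetical placeholders for this repo's actual model and weights:

```python
import torch
from torch import nn

device = torch.device("cpu")

# Stand-in network; substitute the repo's actual model class here.
net = nn.Sequential(nn.Linear(8, 8), nn.LeakyReLU(0.2))

# map_location remaps CUDA-saved tensors onto the CPU at load time;
# "model.pt" is a hypothetical checkpoint path.
state = torch.load("model.pt", map_location=device)
net.load_state_dict(state)
net.eval()

# Run inference without tracking gradients, keeping inputs on the CPU.
with torch.no_grad():
    out = net(torch.randn(1, 8, device=device))
```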