CPU-only inference? #6
Sorry, I realise this is specific to the model, but does DensePose have a CPU-only mode?
At the moment the server can technically perform inference on the CPU, but it depends on the framework and architecture of the particular model. For example, the TensorFlow and Caffe builds in the Docker image are the GPU versions, and I'm not sure how they would behave when falling back to CPU. The included OpenCV library, however, does work on CPU. As for DensePose and Mask R-CNN, they currently require a GPU for inference.
Yeah, I already knew most of what you've told me. For TensorFlow you can use CPU or GPU at will, as follows: The issue about CPU-only inference for Detectron was logged by me some time ago:
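For readers landing here: a minimal sketch of how TensorFlow can be steered onto the CPU even when the GPU build is installed. The `visible_gpus` helper is hypothetical (not part of TensorFlow or this project); the `CUDA_VISIBLE_DEVICES` technique is standard for CUDA-based frameworks.

```python
import os

# Hide all CUDA devices *before* the framework is imported; a GPU build
# of TensorFlow will then fall back to its CPU kernels.
os.environ["CUDA_VISIBLE_DEVICES"] = "-1"

# Alternatively, with TensorFlow already imported, individual ops can be
# pinned to the CPU explicitly:
#
#   import tensorflow as tf
#   with tf.device("/cpu:0"):
#       result = tf.matmul(a, b)

def visible_gpus():
    """Return the GPU ids CUDA is allowed to see ([] when all are hidden)."""
    ids = os.environ.get("CUDA_VISIBLE_DEVICES", "")
    return [i for i in ids.split(",") if i and i != "-1"]

print(visible_gpus())  # → []
```

Setting the environment variable works for any CUDA-based framework (TensorFlow, Caffe), while `tf.device` is TensorFlow-specific and finer-grained.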
Many visual-effects facilities have large investments in CPU-only render farms. Is it possible to do inference on a distributed CPU render farm?
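Per-frame inference is embarrassingly parallel, so one plausible shape for the render-farm question is to fan frames out to CPU workers. A hedged sketch using Python's `multiprocessing` on one machine; `infer` is a hypothetical stand-in for a CPU-capable model pass (e.g. the OpenCV path mentioned above), and a real farm would dispatch via its queue manager instead.

```python
from multiprocessing import Pool

def infer(frame_id):
    # Placeholder for a per-frame, CPU-only inference call
    # (e.g. an OpenCV DNN forward pass); here it just tags the frame.
    return frame_id, "processed-%d" % frame_id

if __name__ == "__main__":
    frames = range(8)
    # Each worker process handles frames independently; no GPU required.
    with Pool(processes=4) as pool:
        results = dict(pool.map(infer, frames))
    print(results[0])  # → 'processed-0'
```

The same pattern scales across hosts by replacing `Pool` with the farm's job scheduler, as long as the model itself has a CPU inference path, which, per the comment above, DensePose and Mask R-CNN currently lack.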