
CPU only inference? #6

Closed
samhodge opened this issue May 25, 2019 · 3 comments

@samhodge

Many visual effect facilities have large investments in CPU only render farms, is it possible to do inference on a distributed CPU render farm?

@samhodge
Author

Sorry, I realised this is specific to the model, but does DensePose have a CPU-only mode?

@ringdk
Contributor

ringdk commented May 27, 2019

At the moment the server can technically perform inference on the CPU, but it depends on the framework and architecture of the particular model. For example, the TensorFlow and Caffe builds in the Docker image are the GPU versions, and I'm not sure how they'd behave falling back to CPU. The included OpenCV library, however, will run on the CPU.

Regarding DensePose and Mask R-CNN, they currently require a GPU for inference.

@samhodge
Author

Yeah, I already knew most of that:

For TensorFlow you can place operations on CPU or GPU at will:
https://www.tensorflow.org/guide/graphs#placing_operations_on_different_devices
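For reference, a minimal sketch of forcing CPU execution. The commented `tf.matmul` snippet is illustrative only (it assumes hypothetical `features` and `weights` tensors), and the environment-variable trick assumes a CUDA-enabled framework build that discovers GPUs at import time:

```python
import os

# Hiding every CUDA device *before* the framework is imported forces it
# to place all ops on the CPU; this works for TensorFlow and for most
# other CUDA-enabled frameworks that enumerate GPUs at import time.
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# With TensorFlow's graph API you can also pin individual ops explicitly,
# as described in the device-placement guide linked above:
#
#   import tensorflow as tf
#   with tf.device('/cpu:0'):
#       logits = tf.matmul(features, weights)  # placed on the CPU
```

Setting `CUDA_VISIBLE_DEVICES` is coarser than `tf.device` but has the advantage of working without touching the model code at all.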

I logged an issue about CPU-only inference for Detectron some time ago:

facebookresearch/Detectron#54
