mxnet not using the GPU with mxnet-cu90 on Windows #106
Hi, which CUDA version did you install? mxnet-cu90 works fine with CUDA 9.0; you could also try mxnet-cu80 or mxnet-cu91. Also, if you have multiple GPUs, could you try model = multi_gpu_model(model, gpus=num_gpus)?
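For reference, a minimal sketch of that multi-GPU suggestion, assuming keras-mxnet's Keras 2.1.x-style keras.utils.multi_gpu_model; num_gpus and the tiny model are placeholders, not from the thread:

```python
# Hedged sketch of the multi-GPU suggestion above; assumes keras-mxnet's
# keras.utils.multi_gpu_model (Keras 2.1.x API). num_gpus is a placeholder.
try:
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.utils import multi_gpu_model

    model = Sequential([Dense(10, activation="softmax", input_shape=(784,))])
    num_gpus = 2  # placeholder: set to the number of GPUs on your machine
    model = multi_gpu_model(model, gpus=num_gpus)
    model.compile(loss="categorical_crossentropy", optimizer="adam")
    status = "compiled for {} GPUs".format(num_gpus)
except Exception as exc:  # keras missing, or fewer GPUs than requested
    status = "sketch only ({})".format(type(exc).__name__)
print(status)
```

The try/except guard is only there so the sketch degrades gracefully on a machine without Keras or without enough GPUs.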
I have a single GTX 1070 GPU. I have CUDA 9.0 installed (V9.0.176), and it was working with the TensorFlow backend for 9.0. Should I still try cu80 or cu91?
Let me try to reproduce and get back to you. Thanks.
Hi @roboserg, I was not able to reproduce the error. Could you provide minimal reproducible code? How are you building the model? I am testing on an AWS P3.8xlarge instance with 4 GPUs, and I tested a few scripts in the examples folder: cifar10_cnn, mnist_cnn, and lstm_text_generation. Pip package versions:
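The version table itself did not survive the page capture. A small sketch to print the same information locally, using the standard library's importlib.metadata (a modern alternative to pip list; the package names are the ones discussed in this thread):

```python
# Print locally-installed versions of the packages discussed in this thread
# (the original version table was not captured above).
from importlib import metadata

versions = {}
for name in ("keras-mxnet", "mxnet-cu90", "mxnet"):
    try:
        versions[name] = metadata.version(name)
    except metadata.PackageNotFoundError:
        versions[name] = "not installed"

for name, ver in versions.items():
    print(name, ver)
```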
Thanks for checking it out. I tried this example: https://github.com/awslabs/keras-apache-mxnet/blob/master/examples/mnist_cnn.py It still utilizes my GPU at 0%-1% and my CPU at 60%+, so I guess it's running CPU-only. I am sure it's mxnet, since it says "Using MXNet backend" after importing Keras. For the Keras config I was editing C:\Users\Roboserg\.keras\keras.json. I have mxnet-cu90 version 1.2 and keras-mxnet version 2.1.6.1. I am testing on my local Windows 10 machine in an Anaconda env. I don't know what else to try; any help would be appreciated. Is there a way to force mxnet to use the GPU? On the other hand, I don't have non-GPU versions of mxnet installed, only the GPU one. P.S. Would you mind telling me how I can print the same table as you?
Using incubator-mxnet directly indeed uses my GPU. But how do I run mxnet with the GPU in a Jupyter notebook with Keras, for example using examples/mnist_cnn.py linked above (from keras-apache-mxnet)?
On Linux it's nvidia-smi; on Windows, run nvidia-smi.exe under your NVIDIA installation folder. Let me test this on Windows. You can specify which device to run on by passing a context param during compile, for example in mnist_cnn.py by changing line 59. We hide this from users as it's MXNet-only and not part of the Keras API.
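The snippet for that change was not captured above. A hedged sketch, assuming keras-mxnet's MXNet-only context argument to model.compile (the exact line number and model are illustrative, not from the thread):

```python
# Hedged sketch: keras-mxnet accepts an MXNet-only `context` argument to
# compile(), pinning training to a device. ["gpu(0)"] targets the first GPU.
try:
    import keras
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential([Dense(10, activation="softmax", input_shape=(784,))])
    model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.Adadelta(),
                  metrics=["accuracy"],
                  context=["gpu(0)"])  # MXNet backend only; use ["cpu()"] on CPU
    status = "compiled with GPU context"
except Exception as exc:  # keras-mxnet missing, or backend lacks `context`
    status = "sketch only ({})".format(type(exc).__name__)
print(status)
```

On a non-MXNet Keras backend the context keyword is not accepted, which is why the guard is there.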
@roboserg - Were you able to train with GPUs?
Yes, see below please.
@roywei thank you very much! Thanks again; I was able to train the MNIST example in only 80 seconds on my GTX 1070. That's 60000 * 12 images, incredible!
@roboserg Tested on Windows: indeed it does not utilize the GPU by default. I think it's a bug; for now, just use the context param. Thanks for the catch!
I was using the latest Keras with TensorFlow on GPU. I installed mxnet in the following way:
pip install keras-mxnet
pip install mxnet-cu90
I changed "backend": "mxnet" in the Keras config file. In Jupyter I see "Using MXNet backend", but when training, only the CPU is utilized, not the GPU.
Any advice?
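The backend edit above can be sanity-checked programmatically. A small sketch that reads the same config file (~/.keras/keras.json) with only the standard library:

```python
# Verify which backend Keras is configured to use by reading the same
# config file mentioned above (~/.keras/keras.json).
import json
import os

config_path = os.path.expanduser(os.path.join("~", ".keras", "keras.json"))
if os.path.isfile(config_path):
    with open(config_path) as f:
        backend = json.load(f).get("backend", "tensorflow")
else:
    backend = "tensorflow"  # Keras default when no config file exists
print("configured backend:", backend)
```

Note that this only confirms which backend Keras will load, not whether that backend will actually use the GPU.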