Enable gpu error #427
Comments
@yangyang-zhang Thanks for the feedback. We will look into this.
@yangyang-zhang You may install the GPU requirements if you want to run on the GPU backend: https://github.com/NervanaSystems/neon/blob/master/gpu_requirements.txt, for example `pip install pycuda scikit-cuda pytools`.
@baojun-nervana I am getting the same error even after installing all prerequisites: `error: argument -b/--backend: invalid choice: 'gpu' (choose from 'cpu', 'mkl')`
@armando-fandango Can you try installing the GPU dependencies? `pip install pycuda scikit-cuda pytools`
@baojun-nervana Yes, all the GPU dependencies are installed, and I believe my message above said that the error occurs "even after installing all pre-req".
I see that in the Makefile you are doing this: `nvcc neon/backends/util/check_gpu.c > /dev/null 2>&1 && ./a.out && rm a.out && echo true`. This always returns an empty HAS_GPU string, no matter what. I compiled check_gpu.c outside the Makefile and it compiles fine. Is something fishy in the above code?
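For context, the Makefile probe above only echoes `true` if every step of the chain succeeds: `nvcc` must be found on PATH, the compile must succeed, and the resulting `a.out` must exit 0. Any failure leaves HAS_GPU empty. The following is a minimal Python sketch of that same logic (the function name `has_gpu` is mine, not part of neon), which makes the failure modes explicit:

```python
import os
import shutil
import subprocess
import tempfile

def has_gpu(compiler="nvcc", source="neon/backends/util/check_gpu.c"):
    """Mimic the Makefile's HAS_GPU probe: compile check_gpu.c and run it.

    Returns False (i.e. an empty HAS_GPU) if the compiler is not on PATH,
    if compilation fails, or if the compiled checker exits non-zero --
    which is why a missing/unreachable nvcc silently disables the GPU build.
    """
    if shutil.which(compiler) is None:
        return False  # nvcc not on PATH: the most common cause of empty HAS_GPU
    with tempfile.TemporaryDirectory() as tmp:
        out = os.path.join(tmp, "a.out")
        try:
            subprocess.run([compiler, source, "-o", out],
                           check=True, capture_output=True)
            subprocess.run([out], check=True, capture_output=True)
        except (subprocess.CalledProcessError, OSError):
            return False
    return True
```

Because the Makefile redirects both stdout and stderr to `/dev/null`, a PATH problem produces no visible error at all, matching the "always empty" symptom described above.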
It seems you don't have the nvcc compiler installed, or it is not on your PATH. Make sure `which nvcc` returns the right path first.
This command works and produces a.out: `nvcc neon/backends/util/check_gpu.c`. I am reinstalling the whole NVIDIA driver and CUDA library to see if there is some problem with that.
It sounds like it should work. Is the PATH set up right too? `export PATH="/usr/local/cuda/bin:$PATH"`
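A typical CUDA shell setup looks like the sketch below. The install prefix `/usr/local/cuda` is an assumption (it is the default symlink on most Linux CUDA installs; adjust to your actual location), and `LD_LIBRARY_PATH` is included because runtime linking often needs it in addition to PATH:

```shell
# Assumes a default CUDA install under /usr/local/cuda; adjust if yours differs.
export PATH="/usr/local/cuda/bin:$PATH"
export LD_LIBRARY_PATH="/usr/local/cuda/lib64:${LD_LIBRARY_PATH:-}"

# Verify that the Makefile's probe will be able to find the compiler:
command -v nvcc && echo "nvcc found" || echo "nvcc NOT on PATH"
```

Putting these exports in `~/.bashrc` (or the equivalent for your shell) keeps them across sessions; otherwise `make` run from a fresh terminal will again fail to find nvcc.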
After reinstalling CUDA and neon, I am now getting this error from `python3 examples/mnist_mlp.py -b gpu`: `During handling of the above exception, another exception occurred: Traceback (most recent call last):`
After I ran this command: `sudo chown -R armando.armando ~/.cache`, it now works.
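The `chown` fix above suggests the failure was a permissions problem: pycuda caches compiled GPU kernels under `~/.cache`, and if anything was ever run there with sudo, the directory can end up root-owned so later runs as a normal user fail. A small diagnostic sketch (the helper name `cache_owned_by_me` is mine, not part of neon or pycuda):

```python
import os
from pathlib import Path

def cache_owned_by_me(cache_dir=None):
    """Return True if the cache directory is owned by the current user.

    pycuda stores its compiler cache under ~/.cache; a root-owned
    directory there (left behind by a sudo run) breaks later user runs,
    which is what the chown -R command above repairs.
    """
    cache_dir = Path(cache_dir or Path.home() / ".cache")
    if not cache_dir.exists():
        return True  # nothing to reclaim yet
    return cache_dir.stat().st_uid == os.getuid()
```

If this returns False for `~/.cache`, `chown -R $USER ~/.cache` (as in the comment above) hands the directory back to your user.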
Thank you, I've already solved it. |
I installed CUDA and set up the environment. I don't understand why the CPU and GPU setups install the same package; a separate GPU package should be used, the way TensorFlow installs a dedicated GPU package.