
GPU not accessible for running TensorFlow and installing CUDA #1788

Closed
samimoftheworld opened this issue Mar 19, 2017 · 53 comments

Comments

@samimoftheworld

I tried running TensorFlow with GPU support and wanted to install CUDA, since I have a graphics card present, but it always says no graphics driver was found.
Please help; I need to run some Python code for my ML class.

@aseering
Contributor

aseering commented Mar 19, 2017

Hi @samimoftheworld -- unfortunately, CUDA currently cannot be made to work with WSL. A future release might someday support it, but not currently. This is discussed on several existing tickets; if you want to understand more, try searching this ticket system for "cuda" and reading through the first few (oldest) tickets you find -- people have been requesting and discussing this for a long time.

@samimoftheworld
Author

samimoftheworld commented Mar 20, 2017

A future release might someday support it, but not currently.

I hope this happens soon 👍

@Rotzbua

Rotzbua commented Mar 20, 2017

@samimoftheworld Here you can upvote: https://wpdev.uservoice.com/forums/266908-command-prompt-console-bash-on-ubuntu-on-windo/suggestions/16108045-opencl-cuda-gpu-support

@fpopineau

TensorFlow is available for Windows, with or without GPU support:
https://www.tensorflow.org/install/install_windows

@samimoftheworld
Author

@fpopineau Yes, TensorFlow is available for Windows, but it doesn't work with the GPU -- I have tried it. And training a neural net without a GPU is too slow, hence I was looking for GPU support in WSL, since it's practically Linux on Windows, but I guess it has no hardware access for now. And @Rotzbua, I upvoted that, but it seems to be the most upvoted feature users want, yet the devs seem to do nothing about it... Let's just hope the next update brings GPU support 👍

@fpopineau

@samimoftheworld I don't deny the importance of getting GPU support in WSL, but TF does work with the GPU on Windows. A quick test on my laptop:

2017-03-28 10:32:53.513855: I c:\tf_jenkins\home\workspace\nightly-win\device\gpu\os\windows\tensorflow\core\common_runtime\gpu\gpu_device.cc:887] Found device 0 with properties:
name: Quadro K1100M
major: 3 minor: 0 memoryClockRate (GHz) 0.7055
pciBusID 0000:02:00.0
Total memory: 2.00GiB
Free memory: 1.67GiB
2017-03-28 10:32:53.514049: I c:\tf_jenkins\home\workspace\nightly-win\device\gpu\os\windows\tensorflow\core\common_runtime\gpu\gpu_device.cc:908] DMA: 0
2017-03-28 10:32:53.517198: I c:\tf_jenkins\home\workspace\nightly-win\device\gpu\os\windows\tensorflow\core\common_runtime\gpu\gpu_device.cc:918] 0:   Y
2017-03-28 10:32:53.517950: I c:\tf_jenkins\home\workspace\nightly-win\device\gpu\os\windows\tensorflow\core\common_runtime\gpu\gpu_device.cc:977] Creating TensorFlow device (/gpu:0) -> (device: 0, name: Quadro K1100M, pci bus id: 0000:02:00.0)
WARNING:tensorflow:Passing a `GraphDef` to the SummaryWriter is deprecated. Pass a `Graph` object instead, such as `sess.graph`.
Iteration: 0001 cost= 30.943873109
Iteration: 0003 cost= 21.104235386
Iteration: 0005 cost= 20.133856106
Iteration: 0007 cost= 19.718502910
Iteration: 0009 cost= 19.381666216
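(For reference, a device listing like the one above can be produced with something along these lines -- just a sketch assuming a TensorFlow 1.x GPU build, not necessarily the exact script behind the log:)

```python
# Lists the devices TensorFlow can see; on a working CUDA setup this
# includes a GPU entry matching the "Creating TensorFlow device (/gpu:0)"
# line in the log above.
import tensorflow as tf
from tensorflow.python.client import device_lib

print("TensorFlow", tf.__version__)
for dev in device_lib.list_local_devices():
    print(dev.device_type, dev.name)
```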

@samimoftheworld
Author

@fpopineau Then you must be using Python 3.5, because it surely doesn't work with Python 2.7.

@zjjott

zjjott commented Jun 12, 2017

+1

@majidaldo

MS wants to sell you Win10 Pro so you can get Hyper-V, so you can run any OS with access to the hardware. The more useful WSL is, the less incentive there is to get Win10 Pro.

@hhoeflin

hhoeflin commented Aug 5, 2017

@majidaldo

MS wants to sell you Win10 Pro so you can get Hyper-V, so you can run any OS with access to the hardware. The more useful WSL is, the less incentive there is to get Win10 Pro.

Microsoft has done truly impressive work with WSL. I am using it and am very happy with it; it is one of the most exciting developments for me in recent years for Linux. Therefore I wanted to ask you to please refrain from speculation about what you perceive as Microsoft's interests and motives, at least on here.

And I agree, having this feature would be really great.

@ghost

ghost commented Dec 19, 2017

I am looking forward to it because the AMD Linux driver is a piece of shit. So there are two options when you want to use AMD cards for deep learning:

  1. Choose Linux and suffer with the broken AMD driver. It doesn't work.
  2. Choose Windows and suffer from there being no TensorFlow support available.

If Windows can support OpenCL in WSL, well, that's fantastic.

@ed-alertedh

Sorry for resurrecting a dead thread, but @majidaldo is wrong. Only Windows Server supports "Discrete Device Assignment" in Hyper-V, a.k.a. PCI passthrough.

I'd sooner switch to a Linux hypervisor with a Windows guest than try to run Windows Server as my everyday desktop environment...

https://social.technet.microsoft.com/Forums/windows/en-US/b9e21b8f-8774-49c2-b499-b2b8ff2a41a2/hyperv-windows-10-pci-passthrough?forum=win10itprovirt

@idiomer

idiomer commented Jun 8, 2018

Does WSL currently support tensorflow-gpu?

@ed-alertedh

@idiomer No, CUDA and OpenCL are still not supported in WSL. I suggest you go here and upvote the feature request: https://wpdev.uservoice.com/forums/266908-command-prompt-console-bash-on-ubuntu-on-windo/suggestions/16108045-opencl-cuda-gpu-support

@AmoghM

AmoghM commented Oct 23, 2019

It has been 2 years since this issue was opened, and we still don't have this critical feature?

@samimoftheworld
Author

Yeah, I just think the team has given up on this issue. I believe it's better to just dual boot and start doing your experiments. Alternatively, try one of the cloud services like AWS SageMaker or Google Cloud Platform; there is also a really nice initiative by Google, https://colab.research.google.com, if you want to do ML projects. But I believe this feature will never be built and the issue will never be closed.

@fpopineau

fpopineau commented Oct 23, 2019 via email

Wouldn't WSL2 solve this problem? I expect that this feature may come more naturally in WSL2. Any insight?

@samimoftheworld
Author

Wouldn't WSL2 solve this problem? I expect that this feature may come more naturally in WSL2. Any insight?

I wish, but look at their official WSL 2 FAQ: https://docs.microsoft.com/en-us/windows/wsl/wsl2-faq#can-i-access-the-gpu-in-wsl-2-are-there-plans-to-increase-hardware-support . They just clearly won't do it.

@AmoghM

AmoghM commented Oct 23, 2019

@samimoftheworld Thanks for the info. It is so unfortunate. They are losing an entire ML community. Ugh.

@ed-alertedh

I bet the real issue is they don't want to put Hyper-V device passthrough in the non-server editions of Windows because that is an "enterprise feature". C'mon though, if that were the issue, surely you could enable it just for WSL 2 without allowing other Hyper-V machines to use those features. People would probably even be willing to pay for it, TBH.

@samimoftheworld
Author

@AmoghM Yeah, I know, but I guess Microsoft just doesn't want to invest in the ML community; as a matter of fact, you can see Google doing way more and hence getting more revenue as well.

@ed-alertedh Yeah, I just think the WSL team is not funded enough to research this. I have shifted to Ubuntu completely, since most of the things Windows offers, like Word, PPT, etc., are available for free through Google's cloud suite, so I don't see a need for Windows anymore. Frankly, I believed that integrating open source could save the OS and attract more developers, but clearly they don't care, so it's better to just move on to Linux. Everything Windows was popular for is now shifting to the cloud, so the switch would be easy now.

@danvargg

+1

@Luxcium

Luxcium commented Jan 7, 2020

I am lost here (using VS Code, with Python installed on WSL using Anaconda). I would like to have GPU support on my Windows Server 2019, and I understand it can't work with WSL, but where should I find support for using it on my Windows Server? (I think I am lost because my Python is installed in WSL and my CUDA is installed on Windows.) If someone from the future sees this (later today or three years from now), can you help me by pointing me to the right resource, even if this post is not constructive help for this issue?

@Sparh4wk

Sparh4wk commented Jan 23, 2020

Seriously? 3 years and we still don't have GPU support, even in WSL2? C'mon guys... WSL is such an amazing thing, but no GPU support is a big no-go. It's the only thing keeping me on my dual boot at the moment :(

@maedoc

maedoc commented Jan 31, 2020

I came here expecting a technical discussion about the problem, not a pile of self-serving complaints. Setting up CUDA & TF GPU on Win 10 is a lot easier and more stable than GPU passthrough with KVM on Linux (I've done both), so it isn't surprising that it's not supported even in WSL2; if your host OS is Win 10, you should just use the native driver + tf-gpu stack.

@samimoftheworld
Author

I came here expecting a technical discussion about the problem, not a pile of self-serving complaints. Setting up CUDA & TF GPU on Win 10 is a lot easier and more stable than GPU passthrough with KVM on Linux (I've done both), so it isn't surprising that it's not supported even in WSL2; if your host OS is Win 10, you should just use the native driver + tf-gpu stack.

Well, this is technically the place for reporting issues and making feature requests. CUDA and TF on Win 10 can be useful for some, but it doesn't work properly -- I have used it and I know how badly it performs. Also, it's never just about TensorFlow; there are many other dependencies built on top of the core ML project that have to work with it, such as Docker containers that need to be initialised to use the resources accordingly. And Docker can literally use the GPU through its config, so it's rather wrong to say it's unstable to use a GPU with KVM.

@Jasha10

Jasha10 commented Feb 9, 2020

Surprised that nobody has mentioned this:

  1. Install Python + Tensorflow + CUDA support for Windows
  2. Open the WSL command line
  3. Type python.exe at the WSL command line

Obviously this does not address the root of the problem, but it works well enough for many use cases. You get native (Windows) support for CUDA with the Linux command-line experience.
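For example, a quick sanity check -- just a sketch, assuming a TensorFlow 2.1+ GPU build in the Windows Python environment, and the file name is only illustrative -- saved as a script and run from WSL with python.exe:

```python
# check_gpu.py -- run from the WSL shell as:  python.exe check_gpu.py
# Prints the GPUs visible to the Windows-side TensorFlow install.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("TensorFlow version:", tf.__version__)
print("GPUs visible:", [gpu.name for gpu in gpus] if gpus else "none")
```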

If you install Anaconda for Windows, you might want to enable the "add Anaconda to the Windows PATH" option in the installer -- this should make it easier to access the conda.exe executable from your WSL terminal.

@onacrame

onacrame commented Feb 9, 2020

I think the overarching point is not using TensorFlow in WSL per se, but GPU access. TensorFlow or any other deep learning package is not as useful without GPU access, which you absolutely need for computer vision or natural language processing (if you want to train your model in a reasonable time). The use case is a strong one: it avoids having to deal with dual boot. Also, various other packages, like NVIDIA's cuDF, which allows data processing on the GPU, only work on Linux.

Some companies are Windows only environments (like mine).

So it's not just a matter of "whining" like some bright bulb above suggested. It's a valid feature request, and it is currently supported via other routes like VMware ESXi, provided you are running certain designated GPUs, so it's technically very possible to do, albeit requiring a fair amount of development.

@ed-alertedh

@Jasha10 Last time I checked, there is no translation between Linux paths and Windows paths, so trying to run the Windows Python stack from WSL ends up being more trouble than it's worth, IMO. I haven't tried WSL 2 yet, though, so I'm happy to be wrong on that.

@sentura23

PCI passthrough is absolutely essential to bring WSL 2 into a production-deployable state. GPU passthrough is supported with Hyper-V for an assigned VM, so it is unclear why it is not supported for the WSL 2 translation VM. Yes, a user might need to manually go through the PowerShell steps of reassigning the particular PCI device to the WSL 2 translation VM. Ideally this could be an advanced-user feature requiring the manual steps (phase 1) until the WSL2 team sorts out how to make it as easy as KVM virt-manager (phase 2).

@turowicz

turowicz commented Mar 2, 2020

Guys, there are plenty of AI companies, such as mine, Surveily, that deploy their solutions to Linux servers, so there needs to be GPU support in WSL2 if you want developers to use Windows as their main system.

We are currently moving away from Windows just because of that.

@vivian-ng

WSL sounded so promising, yet not being able to access the GPU after so many years just shows that it is fundamentally broken. I really hoped to be able to use a single OS for all my needs, but I guess that OS will be Linux since Windows cannot meet this very fundamental one.

@turowicz

@vivian-ng What we need is a Debian-based Winux :D

@haimat

haimat commented Apr 7, 2020

WSL sounded so promising, yet not being able to access the GPU after so many years just shows that it is fundamentally broken. I really hoped to be able to use a single OS for all my needs, but I guess that OS will be Linux since Windows cannot meet this very fundamental one.

I am not sure whether it is broken or whether the lack of CUDA support in WSL is by design.
I just have no clue yet why that might be...

It's a pity, though; WSL+CUDA under Windows would come close to a perfect OS.

@antonywu

WSL sounded so promising, yet not being able to access the GPU after so many years just shows that it is fundamentally broken. I really hoped to be able to use a single OS for all my needs, but I guess that OS will be Linux since Windows cannot meet this very fundamental one.

I am not sure whether it is broken or whether the lack of CUDA support in WSL is by design.
I just have no clue yet why that might be...

It's a pity, though; WSL+CUDA under Windows would come close to a perfect OS.

WSL1 is essentially a translation layer from Linux system calls to the Windows NT kernel.
WSL2 is an actual Linux kernel running on Hyper-V. There is high hope that with an actual Linux kernel, Microsoft can finally add GPU support. However, the performance impact of running on Hyper-V is unclear.

@jhofker

jhofker commented May 19, 2020

🎉 CUDA and DirectML support coming in the next few months

@samimoftheworld
Author

🎉 CUDA and DirectML support coming in the next few months

Finally!!! It only took 3 years for this issue thread to finally be addressed 😜😜 All jokes aside, WSL 2 looks really good. There is also a very well integrated GUI coming to WSL (https://www.neowin.net/news/the-windows-subsystem-for-linux-is-getting-gpu-and-gui-support), and I think GPU access will be here in the next major Windows update as well (https://devblogs.microsoft.com/commandline/wsl2-will-be-generally-available-in-windows-10-version-2004/).

@turowicz

https://developer.nvidia.com/cuda/wsl Official!

@AceHack

AceHack commented May 19, 2020

"Additionally, you can even run pre-built framework containers with Docker and the NVIDIA Container Toolkit in WSL."

Given this, does it mean I will be able to use Docker images with different versions of CUDA and TensorFlow installed only in the Docker container and not in WSL, like I do on Linux?

This is terrific news; Windows rules again. I was getting worried.

@ddaspit

ddaspit commented May 21, 2020

In Windows, a process does not have access to the full GPU memory, unlike on Linux, where it does. In WSL, will processes have full access to GPU memory as on normal Linux?

@therealkenc
Collaborator

/fixed 20150

@ghost ghost closed this as completed Aug 7, 2020
@ghost ghost added the fixedininsiderbuilds label Aug 7, 2020
@ghost

ghost commented Aug 7, 2020

This bug or feature request originally submitted has been addressed in whole or in part. Related or ongoing bug or feature gaps should be opened as a new issue submission if one does not already exist.

Thank you!

@banderlog

OK, but when will it be available outside of Insider builds?

@onacrame

OK, but when will it be available outside of Insider builds?

I would also like to know the timetable for the release.

@banderlog

It seems that it is not exclusive to Insider builds anymore:

@blackliner

@banderlog how do you come to that conclusion? Link 1 is just about getting WSL2 installed, which, yes, does not need Insider builds. But link 2 still states Insider builds as required:
[screenshot showing the Insider build requirement]

@banderlog

@blackliner Alas, WSL2 starts from Build 18362 or higher, while CUDA wants 20145 or higher, which is still an Insider build.
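(For anyone unsure which build they are on, a rough check could look like the sketch below; run it with the Windows-side Python, and note that 20145 is just the number quoted above, not an official support matrix.)

```python
# Rough check of the local Windows build against the CUDA-on-WSL
# build requirement mentioned in this thread.
import platform

REQUIRED_BUILD = 20145  # number quoted above; may change in later releases
build = int(platform.version().split(".")[2])  # e.g. "10.0.19041" -> 19041
if build >= REQUIRED_BUILD:
    print(f"Build {build}: meets the >= {REQUIRED_BUILD} requirement")
else:
    print(f"Build {build}: below {REQUIRED_BUILD}, an Insider build is still needed")
```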

This issue was closed.