ROCm & Windows Support #106608
Windows support for PyTorch is currently not available. This includes WSL2-based workflows.
Is there any progress?
Nope, still waiting for official PyTorch support.
Waiting...
AMD needs to implement MIOpen and other components on Windows before PyTorch can be supported there. @rtccreator
and now? :)
Why does AMD work on Linux but not on Windows?
What's the current state of this?
It is said that AMD focused on Linux-based architectures (e.g., PS3, PS4) for years, so their technical documentation for Windows was very poor.
Any updates on this? I recently moved from CUDA to ROCm and I'm not liking the experience one bit on Linux. Ubuntu just hangs completely if my code is about to hit an OOM error, even on an RX 7900 XTX. My RTX 3070 did better than this on Windows; I was easily able to scale training using shared GPU memory there. Now my PyTorch transformer is crashing with even 320M parameters at fp32.
Which Ubuntu version? Native Linux or WSL2?
Native Linux
I hope someone can help you figure it out. ROCm 6 will apparently be released this month; hopefully, Radeon support will be more mature. I'm still hesitant to get an RX 7900 XTX. The experiences I've had over the past 10+ years with AMD's promises about their software stack taught me to be careful. 😏
Yeah, I would say hold off on getting AMD just yet, though don't get Nvidia either. AMD needs to get their act together, and the community needs to keep pushing for AMD until they become like Nvidia, and then we will support Intel. All said and done, PyTorch should also pitch in to get the libraries working on all operating systems, so that regular folks like me can keep experimenting and figuring out where things can be improved.
Obviously AMD is arrogant and ignores Windows users. Intel already supports Windows. I don't have any good feelings towards AMD now :)
I honestly hope somebody in AMD's higher management sees your comment and gets a sense of the common market sentiment against them.
@kjhanjee |
Now I have completely surrendered to Nvidia. I plan to sell my 7900 XTX soon. Thanks to Black Friday, an RTX 4060 Ti 16 GB is enough: it is cheap, has plenty of memory, and even comes with full CUDA support and a complete ecosystem.
Great thing man, it will help out a lot of people. I was able to set up ROCm with the 7900 XTX, and now I am working on deep learning using Ubuntu. This is where it gets tricky: the OS hangs completely or crashes the GPU drivers if it is about to hit an OOM.
This is not our problem; cheap hardware and paper performance make me no longer believe in AMD. It has wasted too much of my time. To be honest, I suggest you try the 4060 Ti 16G. It lets you play Apex and run/train AI at the same time.
For those like myself who have struggled to get AMD inference running locally on Windows, this may be your lucky day. Below is the git for the new transformers release. https://github.com/huggingface/transformers/releases/tag/v4.36.0?utm_source=tldrai
I hope you wouldn't. I know switching to the winning side might be great, but for the longer run, at least in the AI world, pushing for the underdog might just work out.
Actually, you can set an environment variable to make the card report itself as gfx1030.
@Looong01 L https://github.com/saharNooby/rwkv.cpp/blob/master/.github/workflows/build.yml#L189
His GPU doesn't work. Maybe I set it up wrong; can you give me some advice?
Ohhh, does this mean that PyTorch supports HIP?
In Ubuntu you can `export HSA_OVERRIDE_GFX_VERSION=10.3.0`.
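The override mentioned above can be sketched as follows. This is a hedged example: it assumes an RDNA2 card whose real target (e.g. gfx1031 on an RX 6700 XT) is ISA-compatible with the officially supported gfx1030, which is what the `10.3.0` value spoofs.

```shell
# Community workaround (not officially supported by AMD): report the GPU to
# ROCm as gfx1030 so libraries with gfx1030-only kernels will load on it.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Launch the workload from the same shell so it inherits the override, e.g.:
#   python train.py

# Confirm the override is set for child processes.
echo "$HSA_OVERRIDE_GFX_VERSION"
```

Note this only helps when the real architecture is close enough to the spoofed one; on unrelated GPU generations it will crash or silently miscompute.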
Thx |
ROCm 6.0.0 is out, with only two Radeon cards supported under Linux: the RX 7900 XTX and the Radeon VII.
Windows ROCm can run Stable Diffusion now: https://github.com/leejet/stable-diffusion.cpp
How to use it?
Any news regarding PyTorch? Anyway, I also don't understand why the Linux support is that lame (less hardware is supported than on Windows, as I checked).
Sucks. Going to run a Linux distribution to get this to work, if anyone has recommendations on distributions. I have an RX 7900 XTX, so probably looking at Ubuntu.
Ubuntu has the broadest support these days. I would give that a shot first.
ROCm 6.1 will be compatible with Windows, so now you can compile MIOpen and AMDMIGraphX for Windows.
I already tried that with Ubuntu 22.04 but had to give up due to some very basic memory leaks. My 7900 XTX was getting OOMs even with <1B-scale models.
I will try both Linux Mint/Kubuntu and see how that goes. I went ahead and grabbed an extra M.2 drive just to play around with Linux.
I am going for Kubuntu 22.04 LTS and will see how it goes.
What's the status here? Is anything stopping development on getting ROCm to work with PyTorch on Windows?
+1 looking forward to ROCm 6/Windows support with PyTorch 👍 - Any way we can help?
+1 I was trying to get PyTorch on WSL with ROCm working before I realized WSL doesn't expose GPUs correctly enough to use ROCm. It would be great if the pieces have all fallen into place for native Windows PyTorch to support ROCm.
+1 looking forward to ROCm 6/Windows support with PyTorch 👍
Anything new?
Excuse me, does anyone know if I can use ROCm in WSL2 for PyTorch?
Still waiting...
Not as of now; AMD drivers aren't compatible with WSL2, so the GPU can't be made available properly to the Linux kernel.
It might not happen anytime soon. The problem is on AMD's end: they have to bring MIOpen to Windows first, and then it will be possible to create a PyTorch source tree that can be built for ROCm on Windows.
I'm optimistic that issues like that will be solved by 2034.
I'm from 2069 and ROCm is still not available in PyTorch for Windows.
🚀 The feature, motivation and pitch
AMD has released ROCm Windows support, as docs.amd.com shows:
Please add PyTorch support of Windows on AMD GPUs!
Alternatives
No response
Additional context
No response
cc @jeffdaily @sunway513 @jithunnair-amd @pruthvistony @ROCmSupport @dllehr-amd @jataylo @hongxiayang