This repository has been archived by the owner on Sep 7, 2022. It is now read-only.

Script for AMD? #272

Closed
Apoc9512 opened this issue Sep 5, 2022 · 7 comments

Comments

@Apoc9512

Apoc9512 commented Sep 5, 2022

I followed the instructions, yet ran into the issue of not finding a CUDA/NVIDIA GPU. Is there something I can modify or some fork of the script that will work on AMD cards?

@cstueckrath

cstueckrath commented Sep 5, 2022 via email

read here: https://pytorch.org/get-started/locally/ -> ROCm (you'll have to use pip inside of Anaconda: https://stackoverflow.com/questions/41060382/using-pip-to-install-packages-to-anaconda-environment) and here: https://rocmdocs.amd.com/en/latest/
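
For reference, a quick way to sanity-check that the pip-installed ROCm wheel actually ended up in the active conda environment (a minimal sketch; torch.version.hip and torch.version.cuda are attributes of recent PyTorch builds):

```python
import torch

# Path of the torch package Python actually imports; it should point into
# the conda environment the ROCm wheel was installed into.
print(torch.__file__)

# A ROCm wheel's version string typically carries a "+rocm..." suffix.
print(torch.__version__)

# torch.version.hip is a ROCm/HIP version string on ROCm builds (None on
# CUDA builds); torch.version.cuda is the other way around.
print("HIP:", torch.version.hip)
print("CUDA:", torch.version.cuda)
```

If torch.__file__ points at a different site-packages directory than expected, the webui is probably being launched with a different Python/conda environment than the one the wheel was installed into.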

@SJDunkelman

I've installed the non-CUDA build using pip within conda, but it's still presenting the same issue. How do I get webui.cmd to use the pip-installed package?

@cstueckrath

try calling device = torch.device('cuda')

What happens then? This should work with ROCm out of the box (see pytorch/pytorch#10670)
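
For illustration, a minimal sketch of what that looks like on a ROCm build of PyTorch, where AMD GPUs are driven through the same CUDA-style device API (assuming a supported GPU and a ROCm wheel are installed):

```python
import torch

# On a ROCm build of PyTorch, 'cuda' maps to the HIP backend, so the usual
# CUDA-style calls work against an AMD GPU.
if torch.cuda.is_available():
    device = torch.device('cuda')
    x = torch.randn(3, 3, device=device)  # tensor allocated on the AMD GPU
    print(x.device, torch.cuda.get_device_name(0))
else:
    print("No supported GPU is visible to this PyTorch build.")
```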

@phreeware

phreeware commented Sep 5, 2022


So this will only work on Unix since ROCm is Unix-only? Is there any way to get around this? Would WSL work (https://docs.microsoft.com/en-us/windows/wsl/install)?

@kik4444

kik4444 commented Sep 6, 2022

Is any effort being made somewhere towards creating a Docker setup for AMD + the webui, either on this repo or a fork somewhere? I'd like to try running this on my Linux PC, but I get put off when I see how much tweaking I'll have to do with little help online compared to Nvidia. I'm confident in my knowledge of Linux and Python, but my next-to-nonexistent knowledge of A.I. makes me think I'll mess something up.

@cstueckrath

WSL won't work this way. You'll have to install a Linux distribution (don't use Ubuntu 22.04; only 20.04 is supported right now).
You also need to have a supported GPU! A Navi 22 card (e.g. the Radeon RX 6700 XT) is NOT supported by ROCm!

It might be possible to work around all this with manual patching, and even to get it working in WSL2 with help from Microsoft:
https://github.com/microsoft/antares

But this is all too much for me to dig into alone... I hope someone can find a solution that works.
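
One way to check whether ROCm actually recognizes a card before digging further is a small sketch like the one below; the reported device name should correspond to an entry on the supported-hardware list in the ROCm docs linked earlier, and on an unsupported card is_available()/device_count() will typically come back false or zero:

```python
import torch

# Report what this PyTorch build can see; on an unsupported card this may
# already come back empty, even though the GPU works fine for graphics.
print("HIP runtime:", torch.version.hip)
print("Visible devices:", torch.cuda.device_count())
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))
```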

@cstueckrath

you can try the stuff I wrote here: #214 (comment)

I cannot test this myself because of my gpu

@hlky hlky closed this as completed Sep 6, 2022