Other GPU support #26
Comments
If you have the K80 and want to add it to the supported list, run:
After around 40 min, once the installation is done, navigate to /usr/local/lib/python3.7/dist-packages/xformers and save the two files "_C_flashattention.so" and "_C.so". Upload them to any host and send me the link, and I will integrate them into the Colab for K80 users. The files might not show in the Colab file explorer, so you will have to rename them.
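The step above amounts to locating compiled extension modules inside an installed package's directory. A minimal sketch of that lookup, assuming a standard CPython/Linux install layout (the helper name `find_shared_objects` is hypothetical):

```python
# Hypothetical helper: list the .so files shipped inside an installed package,
# e.g. find_shared_objects("xformers") on the Colab in question would be
# expected to contain _C.so and _C_flashattention.so.
import importlib.util
from pathlib import Path


def find_shared_objects(package: str) -> list:
    """Return the .so files inside an installed package, or [] if not found."""
    spec = importlib.util.find_spec(package)
    # Built-in/frozen modules and missing packages have no on-disk directory.
    if spec is None or spec.origin in (None, "built-in", "frozen"):
        return []
    return sorted(Path(spec.origin).parent.glob("*.so"))


# Pure-Python stdlib package: no compiled extensions, so this prints [].
print(find_shared_objects("json"))
```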
Hi. I am using Colab Pro, and very often I have an A100 GPU assigned. Can you add support for that GPU?
In a few hours it will be added to the Colabs.
Perhaps this is the right issue. Running on an A100 (Colab), getting spammed in the output:
Did you make a clean run with an updated Colab from the repo?
Not sure, will try it now and report back. ... or perhaps not now, but when I get an A100 again 🤷‍♂️
@TheLastBen, actually, that doesn't seem to be an A100-specific issue. On a V100 I see:
Running freshly opened: is it a regression? Will try to reproduce on a clean run. UPD: confirmed, clean
Try to reproduce the error with a T4 (free Colab).
Also try other Colabs to see if the same issue happens.
I did this, but I am using conda, so the directories don't match up. I found I did find
For the K80?
I'll let you know if I can get the files from a K80 Colab. Right now all my accounts are either usage-limited to no GPU or are getting T4s, though, so it might be a while.
@TheLastBen I was able to install xformers with a bigger
GPUs unsupported by flash attention don't produce a
Ok, thanks. Quick question: what is
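Flash-attention support comes down to the GPU's CUDA compute capability: the kernels of that era targeted roughly sm_75 and newer, while the K80 is sm_37. A hedged sketch of such a check (the capability table and the `minimum` threshold are illustrative assumptions, not an authoritative support matrix):

```python
# Illustrative compute capabilities (major, minor) for the GPUs in this thread.
# These values are assumptions for the sketch, not an official support list.
KNOWN_CAPABILITIES = {
    "Tesla K80": (3, 7),
    "Tesla P100": (6, 0),
    "Tesla V100": (7, 0),
    "Tesla T4": (7, 5),
    "A100": (8, 0),
}


def supports_flash_attention(gpu_name: str, minimum=(7, 5)) -> bool:
    """Return True if the GPU's compute capability meets the assumed minimum."""
    cap = KNOWN_CAPABILITIES.get(gpu_name)
    return cap is not None and cap >= minimum


print(supports_flash_attention("Tesla T4"))   # True under these assumptions
print(supports_flash_attention("Tesla K80"))  # False: sm_37 is below sm_75
```

On GPUs where this check fails, xformers can still build its other kernels; only the flash-attention extension is skipped.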
The C/C++/CUDA code responsible for the xformers-specific operations (memory-efficient attention included), built for the underlying machine (Python version, CUDA, ...)
@TheLastBen Would you be able to make the whl using that file and add it? I'd make the whl and do a new PR, but it's not working for me.
Sorry, I completely forgot about it; I'll add it as soon as I'm done with the new Dreambooth method.
Just added. If you get the K80, try it in the A1111 Colab and let me know if the wheel works.
Hello, from the notebook I can see that pre-compiled wheels are available only for T4, P100 and V100 GPUs. What about other GPUs like the K80?
Thanks.
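The per-GPU wheel selection described above can be sketched as a simple lookup from the detected GPU model to a prebuilt wheel, falling back to building from source. The wheel filenames and the `wheel_for` helper below are placeholders, not real release assets:

```python
# Hypothetical mapping from GPU model (as reported by e.g. nvidia-smi) to a
# prebuilt xformers wheel. Filenames are illustrative placeholders only.
from typing import Optional

PREBUILT_WHEELS = {
    "Tesla T4": "xformers-T4.whl",
    "Tesla P100-PCIE-16GB": "xformers-P100.whl",
    "Tesla V100-SXM2-16GB": "xformers-V100.whl",
}


def wheel_for(gpu_name: str) -> Optional[str]:
    """Return a prebuilt wheel for this GPU, or None (meaning: build from source)."""
    return PREBUILT_WHEELS.get(gpu_name)


print(wheel_for("Tesla T4"))   # a prebuilt wheel exists
print(wheel_for("Tesla K80"))  # None: no prebuilt wheel, build from source
```

This is why the K80 request in this thread required someone to compile the extensions once and share the resulting files.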