This repository has been archived by the owner on May 14, 2024. It is now read-only.

The cuda problem is not being solved #321

Open
ieya114 opened this issue Dec 15, 2023 · 6 comments

Comments

@ieya114

ieya114 commented Dec 15, 2023

CUDA backend failed to initialize: Found cuDNN version 8700, but JAX was built against version 8904, which is newer. The copy of cuDNN that is installed must be at least as new as the version against which JAX was built. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)

Nothing is working with this error. How can I solve this problem?
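
For reference, a minimal way to check which JAX/cuDNN versions the Colab runtime actually has (a sketch; the nvidia-cudnn-cu12 package name is an assumption based on JAX's CUDA 12 pip wheels, which may not match how this notebook installs JAX):

```python
# Minimal diagnostic sketch for the cuDNN mismatch (run in a Colab cell).
# "nvidia-cudnn-cu12" is an assumption based on JAX's CUDA 12 pip wheels;
# adjust if the notebook installs JAX a different way.
import importlib.metadata as md

for pkg in ("jax", "jaxlib", "nvidia-cudnn-cu12"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed via pip (likely using the system cuDNN)")

# The error means JAX was built against cuDNN 8.9.4 (8904) but only found
# 8.7.0 (8700) at runtime, so either upgrade cuDNN, e.g.
#   !pip install -U "nvidia-cudnn-cu12>=8.9"
# or pin jax/jaxlib back to a release built against the cuDNN Colab ships.
```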

@ClaraSanders

I think it is related to some Google update. I use Automatic1111 from The Last Ben, and it also stopped working for the same reason.

@chrsgcknhmr

I ran TheLastBen's SD Auto1111 Colab, and since Friday I have also been getting the following error message:

WARNING[XFORMERS]:
xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.1.0+cu118 with CUDA 1106 (you have 2.1.0+cu121)
Python 3.9.16 (you have 3.10.12) Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details

Is this a Colab/notebook version issue, or is it somehow connected to my local versions? I ask because python --version locally says I'm running 3.12.
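
For what it's worth, the versions in that warning come from the hosted Colab runtime, not the local machine, so a local python --version won't tell you much. A minimal check run inside the notebook (a sketch, assuming torch is already installed in the runtime):

```python
# Print the versions the Colab runtime itself is using (run in a notebook cell).
import sys
import torch

print("python  :", sys.version.split()[0])
print("torch   :", torch.__version__, "| built for CUDA", torch.version.cuda)

try:
    import xformers
    print("xformers:", xformers.__version__)
except ImportError:
    print("xformers: not importable in this runtime")

# The warning above says the prebuilt xformers wheel targets torch 2.1.0+cu118
# while the runtime ships 2.1.0+cu121, so an xformers build matching the
# runtime's torch/CUDA pair is what it is asking for.
```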

@ClaraSanders

Yeah, as I said above, almost every Colab related to SD is broken due to Google's latest update. There is a temporary fix, but I don't know if it will work for Kohya. Check out my topic on The Last Ben's discussion group.

@Linaqruf
Owner

The CUDA fix has been pushed. It may still show that CUDA is not initialized or something similar, but I finished a LoRA training without errors; the problem is actually in the bitsandbytes version.
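
For anyone still seeing this after the fix, a quick way to confirm which bitsandbytes the runtime actually loaded (a sketch; the force-reinstall line is an example, not the notebook's official pin):

```python
# Sketch: confirm the bitsandbytes version the runtime actually loaded.
import importlib.metadata as md
import bitsandbytes as bnb

print("bitsandbytes (pip metadata):", md.version("bitsandbytes"))
print("bitsandbytes (imported)    :", bnb.__version__)

# If this differs from what the trainer expects, reinstall the pinned version
# and restart the runtime so the old native library isn't kept loaded, e.g.
#   !pip install --force-reinstall bitsandbytes==<pinned version>
```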

@PlumButa

> The CUDA fix has been pushed. It may still show that CUDA is not initialized or something similar, but I finished a LoRA training without errors; the problem is actually in the bitsandbytes version.

It works now, thanks for your efforts.
Your trainer has the best training quality among the Colab notebooks.

@xjdeng

xjdeng commented Dec 21, 2023

I'm able to train using DAdaptation but not Adam8bit as a result of this.
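
That would fit the bitsandbytes explanation above: DAdaptation ships as a separate pure-PyTorch package, while Adam8bit relies on bitsandbytes' native CUDA build. A quick smoke test (a sketch, assuming a GPU runtime and the public bitsandbytes API):

```python
# Smoke test: can bitsandbytes' 8-bit Adam take a step on this GPU?
# A sketch assuming a CUDA runtime is available.
import torch
import bitsandbytes as bnb

param = torch.nn.Parameter(torch.randn(8, 8, device="cuda"))
opt = bnb.optim.Adam8bit([param], lr=1e-4)

loss = (param ** 2).sum()
loss.backward()
opt.step()   # fails here if the bitsandbytes build doesn't match the runtime's CUDA
opt.zero_grad()
print("Adam8bit step completed")
```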
