Torch not compiled with CUDA enabled #33
Comments
I am trying the same thing, but something doesn't add up for me. This will give you torch with CUDA:

```
pip install torch==2.0.0+cu117 torchvision==0.15.0+cu117 torchaudio==2.0.0+cu117 -f https://download.pytorch.org/whl/cu117/torch_stable.html
pip install transformers==4.25.1
```

That completes the setup, BUT every time I run generate my PC freezes and I need to hard reboot. I suspect two causes, but can't confirm either yet: the Nvidia driver (555.85), and possibly Windows itself (Triton, another GPU optimizer, doesn't support Windows). I will try on WSL.
```
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh && bash Miniconda3-latest-Linux-x86_64.sh
eval "$(/home/YOUR-USERNAME/miniconda3/bin/conda shell.bash hook)" && conda init && source ~/.bashrc
conda create -n tooncrafter python=3.8.5 && conda activate tooncrafter
sudo apt update && sudo apt install nvidia-cuda-toolkit
git clone https://github.com/ToonCrafter/ToonCrafter.git && cd ToonCrafter
pip install -r requirements.txt
python gradio_app.py
```
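Before launching the app, it is worth a quick sanity check (not part of the original steps above) that the environment actually picked up a CUDA-enabled torch build and can see the GPU:

```
# Sanity check; assumes the tooncrafter conda env from above is active.
nvidia-smi                      # is the driver/GPU visible from WSL?
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```

If `torch.cuda.is_available()` prints `False` here, the "Torch not compiled with CUDA enabled" error from this issue is expected.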
Also ...
I don't know why, but xformers doesn't work for me, so I removed it. The ToonCrafter decoder uses xformers, which is why I prefer to use the ComfyUI implementation.
You need to install the GPU version of PyTorch. This is a common issue with PyTorch, and reinstalling the GPU build instead of the CPU build will likely resolve it. Going by https://pytorch.org/, you are on Windows, so you can probably run (not tested):
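A plausible form of that command, reusing the CUDA 11.7 wheel index quoted earlier in this thread (an assumption, not the commenter's exact command), is:

```
# Assumption: reconstructed from the cu117 wheel index used earlier in this
# thread; the CUDA-enabled wheel replaces the default CPU-only build.
pip install torch==2.0.0+cu117 -f https://download.pytorch.org/whl/cu117/torch_stable.html
```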
You might need to find the matching version of torchvision as well; I don't remember which one was compatible with torch==2.0.0 on Python 3.8. That being said, I think you'll have an easier time running the pipeline on Linux (or WSL).
I encountered the same bug while reinstalling xformers, which consistently conflicted with torch, torchaudio, and torchvision. To resolve this, I used the following command:
I also reinstalled torch, torchaudio, and torchvision, or they may have been pulled in during the xformers installation; I can't recall precisely. For reference, here is the final combination that worked for me (it still throws errors while generating videos, but at least it runs):
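Whatever combination ends up installed, a consistency check along these lines can confirm that torch, torchvision, torchaudio, and xformers resolve together and that the CUDA build is active; the commands below are an illustrative suggestion, not the commenter's actual setup:

```
# Illustrative check only; the commenter's exact version pins are not listed above.
pip check                                   # report broken/conflicting dependencies
pip list | grep -E "^(torch|torchvision|torchaudio|xformers)"
python -c "import torch, xformers; print(torch.__version__, xformers.__version__, torch.cuda.is_available())"
```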
I am trying to test the tool, but after running the installation process it does not work when I click on generate.
In the console an exception is raised during CUDA initialization ("Torch not compiled with CUDA enabled").
My card is a 3080 with 10 GB.
Below I attach the full log after running the gradio app.