
CUDA_HOME environment variable is not set (I think it is set) #14

Open
wirytiox opened this issue Feb 25, 2024 · 2 comments

Comments

@wirytiox

Got this error. I had to do a lot of steps to install everything.
(screenshots of the error attached)
I followed a few tutorials on how to add it, with no luck. Is it my fault, or is there an issue?

Traceback (most recent call last):
  File "C:\Users\juanj\Desktop\megumin\OneReality\OneRealityMemory.py", line 33, in <module>
    from exllamav2 import (
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\model.py", line 23, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\config.py", line 2, in <module>
    from exllamav2.fasttensors import STFile
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\fasttensors.py", line 5, in <module>
    from exllamav2.ext import exllamav2_ext as ext_c
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\ext.py", line 153, in <module>
    exllamav2_ext = load \
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 1306, in load
    return _jit_compile(
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 1710, in _jit_compile
    _write_ninja_file_and_build_library(
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 1800, in _write_ninja_file_and_build_library
    extra_ldflags = _prepare_ldflags(
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 1887, in _prepare_ldflags
    extra_ldflags.append(f'/LIBPATH:{_join_cuda_home("lib", "x64")}')
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 2407, in _join_cuda_home
    raise OSError('CUDA_HOME environment variable is not set. '
OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
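For reference, PyTorch's extension builder locates the CUDA toolkit roughly in this order (a simplified sketch of the lookup in `torch.utils.cpp_extension`, not the exact code; the glob below is the usual default Windows install location):

```python
import glob
import os
import shutil

def resolve_cuda_home():
    """Roughly how torch.utils.cpp_extension locates the CUDA toolkit:
    1. the CUDA_HOME (or CUDA_PATH) environment variable,
    2. the directory containing nvcc, if nvcc is on PATH,
    3. the default Windows install location."""
    home = os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH")
    if home:
        return home
    nvcc = shutil.which("nvcc")
    if nvcc:
        # nvcc lives in <toolkit>\bin, so go up one level
        return os.path.dirname(os.path.dirname(nvcc))
    candidates = sorted(glob.glob(
        r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v*"))
    return candidates[-1] if candidates else None

print(resolve_cuda_home())
```

So if `nvcc --version` works in the same terminal, or CUDA_HOME / CUDA_PATH is set for that session, the build should find the toolkit. Note that variables set in the Windows environment dialog only reach terminals opened after the change.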
@wirytiox (Author)

I tried this code:

import torch

def check_cuda():
    # Check if CUDA is available
    if torch.cuda.is_available():
        print("CUDA is available.")

        # Print CUDA version
        cuda_version = torch.version.cuda
        print(f"CUDA version: {cuda_version}")

        # Print GPU information
        gpu_count = torch.cuda.device_count()
        for i in range(gpu_count):
            gpu_name = torch.cuda.get_device_name(i)
            print(f"GPU {i + 1}: {gpu_name}")

    else:
        print("CUDA is not available.")

if __name__ == "__main__":
    check_cuda()

and I got "CUDA is not available."
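A likely cause here (this diagnosis helper is my own sketch, not part of exllamav2): `torch.version.cuda` is `None` on CPU-only PyTorch wheels, and in that case no amount of CUDA toolkit installation will make `torch.cuda.is_available()` return True — the fix is reinstalling a CUDA-enabled PyTorch build, which matches what resolved it below.

```python
def diagnose_cuda(cuda_version, cuda_available):
    """Classify why torch.cuda.is_available() might be False, given
    torch.version.cuda and torch.cuda.is_available().
    Written as a pure function so it can run without a GPU."""
    if cuda_version is None:
        return "CPU-only PyTorch wheel: reinstall a CUDA-enabled build"
    if not cuda_available:
        return "CUDA-enabled PyTorch, but no usable GPU driver was found"
    return f"OK: PyTorch built against CUDA {cuda_version}"

# Usage with a real torch install:
#   import torch
#   print(diagnose_cuda(torch.version.cuda, torch.cuda.is_available()))
```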

@wirytiox (Author)

It got fixed by installing the latest version of PyTorch with its respective CUDA version, but now I get this error:

C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\pyannote\audio\core\io.py:43: UserWarning: torchaudio._backend.set_audio_backend has been deprecated. With dispatcher enabled, this function is no-op. You can remove the function call.
  torchaudio.set_audio_backend("soundfile")
The torchaudio backend is switched to 'soundfile'. Note that 'sox_io' is not supported on Windows.
The torchaudio backend is switched to 'soundfile'. Note that 'sox_io' is not supported on Windows.
Traceback (most recent call last):
  File "C:\Users\juanj\Desktop\megumin\OneReality\OneRealityMemory.py", line 33, in <module>
    from exllamav2 import (
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\__init__.py", line 3, in <module>
    from exllamav2.model import ExLlamaV2
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\model.py", line 23, in <module>
    from exllamav2.config import ExLlamaV2Config
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\config.py", line 2, in <module>
    from exllamav2.fasttensors import STFile
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\fasttensors.py", line 5, in <module>
    from exllamav2.ext import exllamav2_ext as ext_c
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\exllamav2\ext.py", line 153, in <module>
    exllamav2_ext = load \
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 1306, in load
    return _jit_compile(
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 1736, in _jit_compile
    return _import_module_from_library(name, build_directory, is_python_module)
  File "C:\Users\juanj\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\cpp_extension.py", line 2132, in _import_module_from_library
    module = importlib.util.module_from_spec(spec)
ImportError: DLL load failed while importing exllamav2_ext: The specified module could not be found.
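A "DLL load failed" after a successful build usually means a dependency of the compiled extension (typically the CUDA runtime DLLs) is not on the Windows search path, or the extension was JIT-compiled against a different CUDA version than the PyTorch wheel now installed; rebuilding the extension after the PyTorch upgrade may help. A minimal sketch of the PATH-style lookup Windows performs (directory and DLL names below are illustrative, not exact):

```python
import os

def find_dll(name, search_dirs):
    """Return the first directory containing `name`, mimicking
    (in simplified form) how Windows resolves a DLL dependency
    by walking the search path."""
    for d in search_dirs:
        if os.path.isfile(os.path.join(d, name)):
            return d
    return None

# Illustrative check (the DLL name depends on your toolkit version):
#   dirs = os.environ["PATH"].split(os.pathsep)
#   print(find_dll("cudart64_12.dll", dirs))
```

If the CUDA runtime DLL is not found on PATH, adding the toolkit's `bin` directory to PATH (or, on Python 3.8+, registering it with `os.add_dll_directory`) is the usual remedy.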
