ImportError: /home/myuser/.cache/torch_extensions/py310_cpu/exllama_ext/exllama_ext.so: undefined symbol: hipblasGetStream #154

@sjstulga

Description

I am using exllama through the oobabooga text-generation-webui on AMD/ROCm. I cloned exllama into the text-generation-webui/repositories folder and installed its dependencies.

Devices: 2x AMD Instinct MI60 (gfx906)
Distro: Ubuntu 20.04.6
Kernel: 5.15.0-76-generic
ROCm version: 5.6.0
PyTorch version: 2.0.1 (built from source)

My command line:

(textgen) myuser@mymachine:~/text-generation-webui$ python server.py --notebook --model airoboros-65B-gpt4-1.4-GPTQ --loader exllama --gpu-split 20,20 --listen --api

Output:

2023-07-13 09:05:48 INFO:Loading airoboros-65B-gpt4-1.4-GPTQ...
2023-07-13 09:05:48 WARNING:Exllama module failed to load. Will attempt to load from repositories.
Successfully preprocessed all matching files.
2023-07-13 09:05:48 ERROR:Could not find repositories/exllama/. Make sure that exllama is cloned inside repositories/ and is up to date.
Traceback (most recent call last):
  File "/home/myuser/text-generation-webui/modules/exllama.py", line 10, in <module>
    from exllama.generator import ExLlamaGenerator
ModuleNotFoundError: No module named 'exllama'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/myuser/text-generation-webui/server.py", line 1157, in <module>
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "/home/myuser/text-generation-webui/modules/models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "/home/myuser/text-generation-webui/modules/models.py", line 296, in ExLlama_loader
    from modules.exllama import ExllamaModel
  File "/home/muser/text-generation-webui/modules/exllama.py", line 19, in <module>
    from generator import ExLlamaGenerator
  File "/home/myuser/text-generation-webui/repositories/exllama/generator.py", line 1, in <module>
    import cuda_ext
  File "/home/myuser/text-generation-webui/repositories/exllama/cuda_ext.py", line 43, in <module>
    exllama_ext = load(
  File "/home/myuser/pytorch/torch/utils/cpp_extension.py", line 1284, in load
    return _jit_compile(
  File "/home/myuser/pytorch/torch/utils/cpp_extension.py", line 1535, in _jit_compile
    return _import_module_from_library(name, build_directory, is_python_module)
  File "/home/myuser/pytorch/torch/utils/cpp_extension.py", line 1929, in _import_module_from_library
    module = importlib.util.module_from_spec(spec)
ImportError: /home/myuser/.cache/torch_extensions/py310_cpu/exllama_ext/exllama_ext.so: undefined symbol: hipblasGetStream
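For context on the final error: an `undefined symbol` at import time typically means the JIT-built `exllama_ext.so` was compiled with references to a function (here the hipBLAS call `hipblasGetStream`) but was not linked against any library that provides it, so the dynamic loader fails when the extension module is loaded. A minimal, generic sketch of that kind of symbol check using `ctypes` (the `libhipblas.so` name below is illustrative; on a correctly linked ROCm setup the symbol should resolve):

```python
import ctypes


def symbol_available(library_path, symbol):
    """Return True if `symbol` can be resolved from the shared library at
    `library_path`. Passing None inspects the current process image,
    i.e. everything already loaded into it."""
    try:
        lib = ctypes.CDLL(library_path)
        getattr(lib, symbol)  # raises AttributeError if unresolved
        return True
    except (OSError, AttributeError):
        return False


# Illustrative: the ImportError above corresponds to the equivalent of
# symbol_available("libhipblas.so", "hipblasGetStream") being False
# for the libraries the extension was actually linked against.
```

Equivalently, running `ldd` on the built extension (`ldd ~/.cache/torch_extensions/py310_cpu/exllama_ext/exllama_ext.so`) shows whether `libhipblas.so` appears in its list of linked libraries at all.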
