[bug]: ROCm docker container doesn't work #8590

@ai-mind

Description

Is there an existing issue for this problem?

  • I have searched the existing issues

Install method

Manual

Operating system

Linux

GPU vendor

AMD (ROCm)

GPU model

RX 7900 GRE

GPU VRAM

16 GB GDDR6

Version number

6.8

Browser

No response

System Information

No response

What happened

bitsandbytes is compiled incorrectly. I tried pulling the latest invokeai:main-rocm, pulling invokeai:6.4-rocm, and building the image manually. All methods produce the same error:

bitsandbytes library load error: Configured CUDA binary not found at /opt/venv/lib/python3.12/site-packages/bitsandbytes/libbitsandbytes_rocm63.so
Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/bitsandbytes/cextension.py", line 290, in <module>
    lib = get_native_library()
          ^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/bitsandbytes/cextension.py", line 270, in get_native_library
    raise RuntimeError(f"Configured CUDA binary not found at {cuda_binary_path}")
RuntimeError: Configured CUDA binary not found at /opt/venv/lib/python3.12/site-packages/bitsandbytes/libbitsandbytes_rocm63.so
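For reference, here is a minimal sketch (run inside the container) that lists which native bitsandbytes libraries the installed wheel actually ships. The package path and the expected libbitsandbytes_rocm63.so name are taken from the traceback above; everything else is just illustrative.

import glob
import os

# Install location of bitsandbytes inside the container's venv (from the traceback).
pkg_dir = "/opt/venv/lib/python3.12/site-packages/bitsandbytes"

# The error above means this file is expected but not present.
expected = os.path.join(pkg_dir, "libbitsandbytes_rocm63.so")

# List every native build the wheel actually ships, so the mismatch is visible.
for lib in sorted(glob.glob(os.path.join(pkg_dir, "libbitsandbytes_*.so"))):
    print(os.path.basename(lib))

print("expected binary present:", os.path.exists(expected))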

What you expected to happen

I expected quantization to work.

How to reproduce the problem

No response

Additional context

No response

Discord username

No response
