System Info
Kernel: 6.5.0-28-generic
Distributor ID: Ubuntu
Description: Ubuntu 22.04.4 LTS
Release: 22.04
Codename: jammy
GPU: Sapphire Pulse RX 7900 XTX
ROCm Version: 6.0.2
CPU: Ryzen 7 7700X
Motherboard: Gigabyte Aorus Elite AX B650 (BIOS: F24c)
Torch version: torch==2.3.0+rocm6.0
Python version: 3.10.14
Reproduction
I'm on the rocm_enabled branch; attempting to compile the ROCm 6.2 testing branch also results in errors. Running the following code triggers the error shown under "Expected behavior" below:
# Huggingface Transformers
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "microsoft/Phi-3-mini-128k-instruct"
bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    # load_in_4bit=True,
    # bnb_4bit_quant_type="nf4",
    # bnb_4bit_compute_dtype=torch.float16
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="cuda",
    trust_remote_code=True,
    attn_implementation="eager",
    quantization_config=bnb_config,  # un-comment to quantize your model; only supports Nvidia GPUs
)

Attached here is ops.hip:
ops.hip.zip
Expected behavior
After running that piece of code, I get the following error:
Error invalid device function at line 679 in file /home/$USER/bitsandbytes/csrc/ops.hip.
Nothing else prints to my terminal.
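On ROCm, "invalid device function" (hipErrorInvalidDeviceFunction) typically means the compiled kernel binary does not include a code object for the GPU's gfx architecture. A minimal sketch to check what architecture the ROCm build of PyTorch reports for the device (assumption: `gcnArchName` is exposed by the installed torch build; on an RX 7900 XTX it should read gfx1100, which must be among the targets ops.hip was built for):

```python
import torch

# Hedged diagnostic sketch: print the gfx architecture of device 0 so it
# can be compared against the arch list bitsandbytes' ops.hip was compiled
# with. gcnArchName is only populated on ROCm builds of PyTorch.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(getattr(props, "gcnArchName", "gcnArchName not available"))
else:
    print("no ROCm/CUDA device visible to torch")
```

If the printed architecture is missing from the build's target list, rebuilding with that gfx target (or setting HSA_OVERRIDE_GFX_VERSION as a workaround) is worth trying; this is a guess based on the usual meaning of the HIP error, not a confirmed diagnosis.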
