Description
Prerequisites
- I have read the documentation.
- I have checked other issues for similar problems.
Backend
Local
Interface Used
CLI
CLI Command
popen_command = [
    "autotrain",
    "llm",
    "--train",
    "--model",
    config["model_name"],
    "--data-path",
    data_directory,
    "--lr",
    learning_rate,
    "--batch-size",
    batch_size,
    "--epochs",
    num_train_epochs,
    "--trainer",
    "sft",
    "--peft",
    "--merge-adapter",
    "--auto_find_batch_size",  # automatically find optimal batch size
    "--project-name",
    project_name,
]
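For context, a minimal self-contained sketch of how a command list like this is built and launched with `subprocess.Popen`. The config values below (model name, paths, hyperparameters) are hypothetical placeholders, not the ones from my actual run:

```python
import subprocess

# Hypothetical values standing in for the real app config.
config = {"model_name": "meta-llama/Llama-2-7b-hf"}
data_directory = "./data"
learning_rate = "2e-4"
batch_size = "4"
num_train_epochs = "3"
project_name = "my-finetune"

popen_command = [
    "autotrain", "llm", "--train",
    "--model", config["model_name"],
    "--data-path", data_directory,
    "--lr", learning_rate,
    "--batch-size", batch_size,
    "--epochs", num_train_epochs,
    "--trainer", "sft",
    "--peft",
    "--merge-adapter",
    "--auto_find_batch_size",  # automatically find optimal batch size
    "--project-name", project_name,
]

# Launch (commented out so the sketch stays side-effect free):
# proc = subprocess.Popen(popen_command,
#                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
# proc.wait()
```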
UI Screenshots & Parameters
No response
Error Logs
Traceback (most recent call last):
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/autotrain/trainers/common.py", line 212, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/autotrain/trainers/clm/main.py", line 28, in train
train_sft(config)
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/autotrain/trainers/clm/train_clm_sft.py", line 27, in train
model = utils.get_model(config, tokenizer)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/autotrain/trainers/clm/utils.py", line 943, in get_model
model = AutoModelForCausalLM.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/transformers/modeling_utils.py", line 3620, in from_pretrained
hf_quantizer.validate_environment(
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/transformers/quantizers/quantizer_bnb_4bit.py", line 79, in validate_environment
from ..integrations import validate_bnb_backend_availability
File "", line 1229, in _handle_fromlist
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1805, in __getattr__
module = self._get_module(self._class_to_module[name])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/deep/.transformerlab/envs/transformerlab/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1819, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.integrations.bitsandbytes because of the following error (look up to see its traceback):
No module named 'triton.ops'
Additional Information
This is caused by the bitsandbytes version being 0.45.0. Upgrading to 0.45.5 fixes it.
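A small sketch of a guard against the broken release, based only on the version numbers reported above (the helper functions are hypothetical, not part of autotrain or bitsandbytes). The upgrade itself is just `pip install -U "bitsandbytes>=0.45.5"`:

```python
def version_tuple(v: str) -> tuple:
    """Parse 'X.Y.Z' into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split(".")[:3])

# 0.45.0 is the release this issue identifies as broken
# (its Triton kernels import 'triton.ops', removed in newer Triton).
BROKEN = version_tuple("0.45.0")

def bnb_is_safe(installed: str) -> bool:
    """Return False for the bitsandbytes release known to trigger this error."""
    return version_tuple(installed) != BROKEN
```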
Activity
github-actions commented on May 16, 2025
This issue is stale because it has been open for 30 days with no activity.
github-actions commented on Jun 5, 2025
This issue was closed because it has been inactive for 20 days since being marked as stale.
ghazal2326 commented on Jun 10, 2025
For me, the issue has been closed.