File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:463, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    461 elif type(config) in cls._model_mapping.keys():
    462     model_class = _get_model_class(config, cls._model_mapping)
--> 463     return model_class.from_pretrained(
    464         pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    465     )
    466 raise ValueError(
    467     f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"
    468     f"Model type should be one of {', '.join(c.__name__ for c in cls._model_mapping.keys())}."
    469 )

File ~/.local/lib/python3.10/site-packages/transformers/modeling_utils.py:2406, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    ...
--> 2406 dispatch_model(model, device_map=device_map, offload_dir=offload_folder, offload_index=offload_index)
   2408 if output_loading_info:
   2409     if loading_info is None:

TypeError: dispatch_model() got an unexpected keyword argument 'offload_index'
Hi @colorzhang, thank you for reporting. What versions of accelerate and transformers are you using? Updating to the latest versions may address this issue.
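For reference, a quick stdlib-only way to check the installed versions before deciding whether to upgrade (a minimal sketch; the package names are the PyPI distribution names):

```python
import importlib.metadata as md  # Python 3.8+

# Collect installed versions of the two packages involved in the mismatch.
versions = {}
for pkg in ("transformers", "accelerate"):
    try:
        versions[pkg] = md.version(pkg)
    except md.PackageNotFoundError:
        versions[pkg] = "not installed"

print(versions)
```

If either version is stale, `pip install -U transformers accelerate` should bring both onto compatible releases.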
Using stabilityai/stablelm-base-alpha-7b
Loading with:
torch_dtype='float16', load_in_8bit=False, device_map='auto'
TypeError                                 Traceback (most recent call last)
Cell In[10], line 17
     14 cprint(f"Loading with:\n{torch_dtype=}, {load_in_8bit=}, {device_map=}\n")
     16 tokenizer = AutoTokenizer.from_pretrained(model_name)
---> 17 model = AutoModelForCausalLM.from_pretrained(
     18     model_name,
     19     torch_dtype=getattr(torch, torch_dtype),
     20     load_in_8bit=load_in_8bit,
     21     device_map=device_map,
     22     offload_folder="./offload",
     23 )
File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:463, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    461 elif type(config) in cls._model_mapping.keys():
    462     model_class = _get_model_class(config, cls._model_mapping)
--> 463     return model_class.from_pretrained(
    464         pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    465     )
    466 raise ValueError(
    467     f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"
    468     f"Model type should be one of {', '.join(c.__name__ for c in cls._model_mapping.keys())}."
    469 )

File ~/.local/lib/python3.10/site-packages/transformers/modeling_utils.py:2406, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    ...
--> 2406 dispatch_model(model, device_map=device_map, offload_dir=offload_folder, offload_index=offload_index)
   2408 if output_loading_info:
   2409     if loading_info is None:

TypeError: dispatch_model() got an unexpected keyword argument 'offload_index'
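To illustrate the mechanism behind this error: the installed transformers passes an `offload_index=` keyword that the (older) installed accelerate's `dispatch_model` does not accept. A self-contained sketch of the mismatch and of a signature-based guard, using a hypothetical stand-in function rather than the real accelerate API:

```python
import inspect

# Hypothetical stand-in for an older dispatch_model that predates
# the `offload_index` keyword (simplified for illustration).
def dispatch_model(model, device_map=None, offload_dir=None):
    return model

# The caller supplies a kwarg the older function does not know about.
kwargs = {"device_map": "auto", "offload_dir": "./offload", "offload_index": None}

# A defensive caller could drop keywords the installed function
# does not accept instead of raising TypeError.
accepted = inspect.signature(dispatch_model).parameters
filtered = {k: v for k, v in kwargs.items() if k in accepted}

print(sorted(filtered))  # 'offload_index' has been dropped
```

In practice, though, the fix is simply to keep transformers and accelerate on compatible versions, as suggested above.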