Description
System Info
transformers==4.52.0.dev0 (installed from source, top of trunk)
Platform: Rocky Linux
```
Traceback (most recent call last):
  File ".local/lib/python3.9/site-packages/peft/tuners/lora/model.py", line 359, in __getattr__
    return super().__getattr__(name)  # defer to nn.Module's logic
  File ".local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1928, in __getattr__
    raise AttributeError(
AttributeError: 'LoraModel' object has no attribute 'prepare_inputs_for_generation'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "nn/phiexp/phi4v1.py", line 9, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File ".local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File ".local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 303, in _wrapper
    return func(*args, **kwargs)
  File ".local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 4504, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File ".cache/huggingface/modules/transformers_modules/microsoft/Phi-4-multimodal-instruct/33e62acdd07cd7d6635badd529aa0a3467bb9c6a/modeling_phi4mm.py", line 1962, in __init__
    peft_model = get_peft_model(self.model, vision_lora_config, adapter_name="vision")
  File ".local/lib/python3.9/site-packages/peft/mapping_func.py", line 123, in get_peft_model
    return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](
  File ".local/lib/python3.9/site-packages/peft/peft_model.py", line 1723, in __init__
    self.base_model_prepare_inputs_for_generation = self.base_model.prepare_inputs_for_generation
  File ".local/lib/python3.9/site-packages/peft/tuners/lora/model.py", line 363, in __getattr__
    return getattr(self.model, name)
  File ".local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1928, in __getattr__
    raise AttributeError(
AttributeError: 'Phi4MMModel' object has no attribute 'prepare_inputs_for_generation'
```
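For context, the failure mechanism in the traceback can be illustrated without torch or peft installed: PEFT's `LoraModel.__getattr__` forwards unknown attribute lookups to the wrapped base model, and `PeftModel.__init__` reads `base_model.prepare_inputs_for_generation`, so the lookup only succeeds if the base model defines that method. A minimal sketch (the stub classes below are hypothetical stand-ins, not the real PEFT or Phi-4 code):

```python
class Phi4MMModelStub:
    """Stand-in for Phi4MMModel: defines no prepare_inputs_for_generation."""


class LoraModelStub:
    """Stand-in for peft.tuners.lora.LoraModel's attribute deferral."""

    def __init__(self, model):
        self.model = model

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails; defer to the
        # wrapped model, mirroring `return getattr(self.model, name)`
        # in peft/tuners/lora/model.py.
        if name == "model":  # guard against recursion before __init__ runs
            raise AttributeError(name)
        return getattr(self.model, name)


lora = LoraModelStub(Phi4MMModelStub())
try:
    # PeftModel.__init__ effectively does:
    #   self.base_model.prepare_inputs_for_generation
    lora.prepare_inputs_for_generation
except AttributeError as exc:
    # The lookup fails on the inner Phi4MMModelStub, matching the
    # final error in the traceback above.
    print(exc)
```

So the `AttributeError` is raised by the wrapped base model, not by the LoRA wrapper itself, which is why the message names `Phi4MMModel`.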
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
Steps to reproduce:
- Install transformers from source (top of trunk) along with its dependencies (peft, torch, etc.)
- Run any official script that loads microsoft/Phi-4-multimodal-instruct
- Model loading fails with the traceback above
Expected behavior
The model should load without errors.