Traceback (most recent call last):
  File "/data/demo/MFTCoder/mftcoder_accelerate/src/pefts/merge_base_and_lora_to_hf.py", line 82, in <module>
    model_to_merge = PeftModel.from_pretrained(base_model, lora_adapter)
  File "/root/miniconda3/envs/mft/lib/python3.9/site-packages/peft/peft_model.py", line 355, in from_pretrained
    model = MODEL_TYPE_TO_PEFT_MODEL_MAPPING[config.task_type](model, config, adapter_name)
  File "/root/miniconda3/envs/mft/lib/python3.9/site-packages/peft/peft_model.py", line 1094, in __init__
    super().__init__(model, peft_config, adapter_name)
  File "/root/miniconda3/envs/mft/lib/python3.9/site-packages/peft/peft_model.py", line 129, in __init__
    self.base_model = cls(model, {adapter_name: peft_config}, adapter_name)
  File "/root/miniconda3/envs/mft/lib/python3.9/site-packages/peft/tuners/lora/model.py", line 136, in __init__
    super().__init__(model, config, adapter_name)
  File "/root/miniconda3/envs/mft/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 148, in __init__
    self.inject_adapter(self.model, adapter_name)
  File "/root/miniconda3/envs/mft/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 328, in inject_adapter
    raise ValueError(
ValueError: Target modules {'c_proj', 'w1', 'c_attn', 'w2'} not found in the base model. Please check the target modules and try again.
This issue may not be related to which MFT loss was used. It's possible that the problem stems from an incorrect setting of the model_type (qwen or qwen2). Could you please confirm whether the model_type used during training and merging is consistent?
It's important to note that the 'qwen' model_type cannot be used to load models based on Qwen2 or later versions.
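A quick way to confirm the mismatch before merging is to compare the adapter's `target_modules` against the module names the loaded base model actually exposes. The helper below is a minimal sketch (not part of MFTCoder); the Qwen2-style module names in the example are illustrative, mirroring how Qwen2 splits attention into `q_proj`/`k_proj`/`v_proj`/`o_proj` rather than the fused `c_attn`/`c_proj` layers of the original Qwen.

```python
# Sketch: check which LoRA target_modules match no module in the base model,
# before PeftModel.from_pretrained raises. Pure-Python; with a real model,
# pass names = [n for n, _ in base_model.named_modules()].

def missing_target_modules(target_modules, module_names):
    """Return the targets that match no module name (exact or suffix match)."""
    return {
        t for t in target_modules
        if not any(name == t or name.endswith("." + t) for name in module_names)
    }

# Hypothetical Qwen2-style module names for one layer:
qwen2_names = [
    "model.layers.0.self_attn.q_proj",
    "model.layers.0.self_attn.k_proj",
    "model.layers.0.self_attn.v_proj",
    "model.layers.0.self_attn.o_proj",
    "model.layers.0.mlp.gate_proj",
    "model.layers.0.mlp.up_proj",
    "model.layers.0.mlp.down_proj",
]

# Qwen(1)-style targets match nothing in a Qwen2 module tree,
# reproducing the ValueError above:
print(missing_target_modules({"c_proj", "w1", "c_attn", "w2"}, qwen2_names))
```

If the returned set is non-empty, the adapter was trained with a different `model_type` than the base model being merged, and retraining (or re-merging) with a consistent `model_type` is needed.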