
Issue merging LoRa adapters back into gptq quantized model #1287

Closed
RonanKMcGovern opened this issue Aug 16, 2023 · 1 comment

Comments

@RonanKMcGovern
System Info

A PEFT model cannot be merged with the GPTQ-quantized model and pushed to the Hub. The error is:

Cannot merge LORA layers when the model is gptq quantized

after trying:

from transformers import AutoModelForCausalLM
from peft import PeftModel

model_id = "ybelkada/llama-7b-GPTQ-test"
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# load PEFT model with new adapters
model = PeftModel.from_pretrained(
    model,
    adapter_model_name,  # repo id or path of the trained LoRA adapter
)

model = model.merge_and_unload()  # merge adapters into the base model

Note that 'model' here is the GPTQ-quantized model.
I had originally posted in AutoGPTQ, but realise this may be a question for here.



### Who can help?

@fxmarty perhaps you could assist? Maybe there is an alternative approach.

### Information

- [ ] The official example scripts
- [X] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction (minimal, reproducible, runnable)

See above

### Expected behavior

This approach to merging typically works (e.g. when I do it with NF4 quantization).
@RonanKMcGovern RonanKMcGovern added the bug Something isn't working label Aug 16, 2023
@SunMarc (Member) commented Aug 16, 2023

Hi @RonanKMcGovern, this is not a bug. We still haven't implemented `merge_and_unload()` for GPTQ-quantized models. Feel free to request this as a feature enhancement in the peft library.
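For context on what `merge_and_unload()` computes (and why it cannot be applied directly to GPTQ's packed integer weights), here is a minimal numpy sketch of the LoRA merge, W' = W + (alpha/r)·B·A. The dimensions and values are illustrative, not taken from the model above:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 8, 2   # weight shape and LoRA rank (illustrative)
alpha = 4           # lora_alpha hyperparameter

W = rng.standard_normal((d, k))   # base weight; fp16/fp32 in practice,
                                  # but packed int4 groups under GPTQ
A = rng.standard_normal((r, k))   # LoRA down-projection
B = np.zeros((d, r))              # LoRA up-projection (starts at zero)
B[0, 0] = 1.0                     # pretend some training happened

scaling = alpha / r
W_merged = W + scaling * (B @ A)  # this addition needs float weights

x = rng.standard_normal(k)
# merged weight gives the same output as base + adapter applied separately
assert np.allclose(W_merged @ x, W @ x + scaling * (B @ (A @ x)))
```

The merge is a floating-point addition into the base weight matrix, which is why it works when the base weights are dequantizable floats (as with NF4) but is not directly defined over GPTQ's packed quantized representation.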

@SunMarc SunMarc removed the bug Something isn't working label Aug 16, 2023
@fxmarty fxmarty closed this as completed Aug 17, 2023