Add Finetune Support #814

Open · alex4321 opened this issue Oct 11, 2023 · 3 comments
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

@alex4321 commented Oct 11, 2023

In recent versions of llama.cpp there are ways to fine-tune a model using LoRA adapters: ggerganov/llama.cpp#2632

However, either I missed it, or the functions introduced in that PR have not been exposed in this project?

Describe alternatives you've considered
Sure, it's always an option to use the CLI tool, but I think a Python binding would be useful.
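
For context: llama-cpp-python can already apply a finished LoRA adapter at load time through the Llama constructor's lora_path parameter; what the PR above adds, and what has no binding yet, is the training side. A minimal sketch of the existing loading path, with placeholder file names:

```python
from llama_cpp import Llama

# Apply an already-trained LoRA adapter at load time (inference only).
# Both file paths are placeholders; the .bin adapter would come from
# llama.cpp's `finetune` example referenced above.
llm = Llama(
    model_path="models/open-llama-3b-v2-q8_0.gguf",
    lora_path="lora-shakespeare-LATEST.bin",
)

out = llm("Shall I compare thee", max_tokens=32)
print(out["choices"][0]["text"])
```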

@abetlen (Owner) commented Oct 11, 2023

@alex4321 not yet, but I'm planning to work on this. The first step is actually exposing the training API in llama.cpp, as it's currently only available as an example that requires working with the GGML API directly.
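
To make the gap concrete, here is a purely hypothetical sketch of what such an exposed entry point might look like from the ctypes layer. The llama_finetune symbol below does not exist in llama.cpp; it is invented here only to illustrate why a stable C function is the prerequisite for a Python binding:

```python
import ctypes

# HYPOTHETICAL: llama.cpp exposes no `llama_finetune` C function; training
# lives only in the examples/finetune program. This only sketches what the
# ctypes layer in llama-cpp-python could bind to if such an entry point
# were added to llama.h.
lib = ctypes.CDLL("libllama.so")  # shared library name is platform-specific

lib.llama_finetune.argtypes = [ctypes.c_char_p, ctypes.c_char_p, ctypes.c_char_p]
lib.llama_finetune.restype = ctypes.c_int

# base model, training data, adapter output path (all placeholders)
rc = lib.llama_finetune(b"base.gguf", b"train.txt", b"lora-out.bin")
```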

antoine-lizee pushed a commit to antoine-lizee/llama-cpp-python that referenced this issue Oct 30, 2023
@abetlen changed the title from "Is there any way to use finetune functions now?" to "Add Finetune Support" on Nov 8, 2023
@abetlen added the enhancement (New feature or request) label on Nov 8, 2023
@satyaloka93 commented

Hi, will there be a way to switch LoRA adapters, like PEFT's enable/disable feature?
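
For reference, the PEFT feature being asked about is the PeftModel.disable_adapter() context manager, which temporarily bypasses a loaded adapter. A minimal sketch of that behaviour, with placeholder model and adapter names:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholder identifiers; any causal LM with a matching LoRA adapter works.
base = AutoModelForCausalLM.from_pretrained("base-model")
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")

# The adapter is active by default; generation here goes through LoRA.

# Temporarily bypass the adapter and run the bare base model:
with model.disable_adapter():
    ...  # generation here ignores the LoRA weights
```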

@abetlen added the help wanted (Extra attention is needed) label on Nov 21, 2023
@nai-kon commented Mar 25, 2024

> Hi, will there be a way to switch LoRA adapters, like PEFT's enable/disable feature?

I want to know this too. Something like load_adapter/set_adapter in the transformers library.
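
The transformers methods mentioned here come from its PEFT integration (PeftAdapterMixin): load_adapter attaches a named adapter to a base model and set_adapter switches between the ones already loaded. A rough sketch of the pattern being requested for llama-cpp-python, with placeholder identifiers:

```python
from transformers import AutoModelForCausalLM

# Placeholder model/adapter paths; requires `peft` to be installed.
model = AutoModelForCausalLM.from_pretrained("base-model")

# Attach two LoRA adapters under different names.
model.load_adapter("path/to/adapter-a", adapter_name="a")
model.load_adapter("path/to/adapter-b", adapter_name="b")

model.set_adapter("a")    # route forward passes through adapter "a"
model.set_adapter("b")    # switch to adapter "b"
model.disable_adapters()  # fall back to the bare base model
model.enable_adapters()   # re-enable the active adapter
```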
