
llama.cpp: Load Loras adapters #919

Closed · mudler opened this issue Aug 17, 2023 · 1 comment · Fixed by #955
Labels: enhancement (New feature or request), high prio

Comments

mudler (Owner) commented Aug 17, 2023

This issue tracks the addition of LoRA adapter loading to the llama.cpp backend.
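As a rough illustration of what this feature could look like once implemented (this sketch is not from the thread; the `lora_adapter` and `lora_base` field names are assumptions), LocalAI models are defined with YAML config files, so LoRA loading would plausibly be exposed as extra fields in the model definition:

```yaml
# Hypothetical LocalAI model definition (field names are illustrative, not confirmed by this issue).
name: llama2-lora
backend: llama
parameters:
  model: llama-2-7b.ggmlv3.q4_0.bin   # base GGML model file

# Assumed fields for applying a LoRA adapter at load time:
lora_adapter: ggml-adapter-model.bin  # path to the LoRA adapter weights
lora_base: llama-2-7b-f16.bin         # optional higher-precision base model used when applying the LoRA
```

The idea is that the backend would apply the adapter weights on top of the base model when it is loaded, rather than requiring a merged checkpoint on disk.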

mudler added the enhancement (New feature or request) and high prio labels on Aug 17, 2023
mudler self-assigned this on Aug 17, 2023
localai-bot (Contributor) commented
⚠️⚠️⚠️⚠️⚠️

Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler); please beware that I might hallucinate sometimes!

But… I can also be funny or helpful 😸, and I can generally give good tips or point out places to look in the documentation or the code, based on what you wrote in the issue.

Please don't engage in conversation with me; I don't support replying yet!

⚠️⚠️⚠️⚠️⚠️

I'm sorry, I don't understand what you mean by "lora adapters". Can you please provide more context or clarify the issue?

