
Working with ollama or llama.cpp #60

Closed · region23 opened this issue Aug 30, 2023 · 10 comments · Fixed by #129
@region23

With the release of codellama, it became possible to run an LLM on a local machine using ollama or llama.cpp.
How do I configure your extension to work with a local codellama?

@mishig25 (Collaborator)

Hello @region23,

Yes, it is possible to use a local model. What you'd need to do is:

  1. Serve the model locally at some endpoint
  2. Change the settings accordingly

Change the endpoint setting to point at your local endpoint, and make sure to update the prompt template to match your model as well (screenshots of the settings UI omitted); see the sketch below.
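For illustration, here is roughly what those two changes could look like in `settings.json`. The key names (`llm.modelIdOrEndpoint`, `llm.fillInTheMiddle.*`) and the port are assumptions based on the extension's configuration options around this time, so check the README of the version you have installed:

```jsonc
{
  // Assumed setting name -- points the extension at a locally served
  // endpoint instead of the hosted Inference API.
  "llm.modelIdOrEndpoint": "http://localhost:8080",

  // Code Llama infilling uses <PRE>/<SUF>/<MID> tokens, so the prompt
  // template must match the model being served.
  "llm.fillInTheMiddle.enabled": true,
  "llm.fillInTheMiddle.prefix": "<PRE> ",
  "llm.fillInTheMiddle.suffix": " <SUF>",
  "llm.fillInTheMiddle.middle": " <MID>"
}
```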

@region23 (Author)

HF Code Error: code - 400; msg - Bad Request

(screenshot of the error, 2023-08-31 18:07, omitted)
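A plausible cause of the 400 (an assumption on my part, since ollama support was not yet implemented, as noted below): the extension sends requests in the Hugging Face Inference API shape, while a local ollama server expects a different body. Roughly:

```jsonc
// Shape the extension sends (HF Inference API):
{ "inputs": "<PRE> def fib( <SUF> <MID>", "parameters": { "max_new_tokens": 60 } }

// Shape ollama's /api/generate expects (model name illustrative):
{ "model": "codellama:7b", "prompt": "def fib(" }
```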

@region23 (Author)

Meanwhile, curl to the API is working:
(screenshot of the curl session, 2023-08-31 18:29, omitted)
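For reference, a direct request of the kind shown probably looks something like the following; the ports are the servers' defaults and the model name is illustrative:

```sh
# llama.cpp's built-in HTTP server (default port 8080)
curl http://localhost:8080/completion \
  -d '{"prompt": "def fibonacci(n):", "n_predict": 64}'

# ollama (default port 11434)
curl http://localhost:11434/api/generate \
  -d '{"model": "codellama:7b", "prompt": "def fibonacci(n):"}'
```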

@McPatate (Member)

For now, ollama's API is not supported; it's on the to-do list though!

cf. huggingface/llm-ls#17

@McPatate (Member)

Also created an issue for llama.cpp: huggingface/llm-ls#28

@github-actions (bot)

This issue is stale because it has been open for 30 days with no activity.

github-actions bot added the stale label on Nov 11, 2023
@flaviodelgrosso

+1

@github-actions (bot)

This issue is stale because it has been open for 30 days with no activity.

github-actions bot added the stale label on Jan 17, 2024
@jefffortune

Is there a timeline for when "feat: Add adaptors for ollama and openai" (#117) might be merged?

github-actions bot removed the stale label on Feb 3, 2024
@McPatate (Member) · Feb 9, 2024

I'm finishing the last touches on the llm-ls fixes and testing that everything works as expected for 0.5.0, and then we should be good to go for a release. I'd say I'll either find some time this weekend or next week :)
