
feat: move /completions under /v1/completions #112

Merged — 2 commits merged into louisgv:main on Sep 20, 2023

Conversation

tomasmcm (Contributor)

This PR updates the server endpoints in local.ai to match the endpoints exposed by llama-cpp-python.
Specifically, it changes /completions to /v1/completions (as in llama-cpp-python/llama_cpp/server/app.py:599)
and /model to /v1/models (as in llama-cpp-python/llama_cpp/server/app.py:805).
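To make the renaming concrete, here is a minimal sketch of how a client might build the versioned endpoint URLs described above. The helper function and `API_VERSION` constant are illustrative assumptions, not code from local.ai or llama-cpp-python:

```python
# Illustrative helper: map a bare endpoint path to its versioned form,
# e.g. /completions -> /v1/completions, /models -> /v1/models.
API_VERSION = "v1"  # assumption: the version segment used by the server

def endpoint(base_url: str, path: str, version: str = API_VERSION) -> str:
    """Build a versioned endpoint URL from a base URL and a path."""
    base = base_url.rstrip("/")
    return f"{base}/{version}/{path.lstrip('/')}"

print(endpoint("http://localhost:8080", "/completions"))
# http://localhost:8080/v1/completions
print(endpoint("http://localhost:8080", "models"))
# http://localhost:8080/v1/models
```

With this PR applied, a client pointed at `http://localhost:8080` would request `/v1/completions` and `/v1/models` rather than the old unversioned paths.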

vercel bot commented Sep 20, 2023

Deployment status: local-ai-web — ✅ Ready, Sep 20, 2023 8:18pm (UTC)

tomasmcm (Contributor, Author)

Can confirm this works 👌 with Continue.dev using GGML(server_url="http://localhost:8080"), which is amazing!

louisgv (Owner) left a comment

LGTM! 👍 This is somewhat related to #42, but we can ponder that later and just make this more compatible with other APIs for now.

We will need to fix the wiki next: https://github.com/louisgv/local.ai/wiki

Side-note: I wonder if we can serve both the /v1 routes and the base API? (That way, when we update to v2, we can point the base routes at the latest version.)

@tomasmcm can you investigate if that's possible?
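One way the "both /v1 and base" idea could work is to register each handler under two aliases in the routing table. This is a hypothetical sketch in Python; the handler names and responses are placeholders, not local.ai's actual server code (which is not shown in this PR):

```python
# Hypothetical sketch: register each handler under both the bare path and
# the /v1 prefix, so /completions and /v1/completions hit the same code.

def completions_handler(body: dict) -> dict:
    # Placeholder response shape, loosely modeled on OpenAI-style APIs.
    return {"object": "text_completion", "echo": body.get("prompt", "")}

def models_handler(body: dict) -> dict:
    return {"object": "list", "data": []}

HANDLERS = {
    "/completions": completions_handler,
    "/models": models_handler,
}

# Build the routing table with both unversioned and versioned aliases.
ROUTES = {}
for path, handler in HANDLERS.items():
    ROUTES[path] = handler          # base alias:      /completions
    ROUTES[f"/v1{path}"] = handler  # versioned alias: /v1/completions

def dispatch(path: str, body: dict) -> dict:
    handler = ROUTES.get(path)
    if handler is None:
        return {"error": f"no route for {path}"}
    return handler(body)
```

Because both aliases point at the same handler object, bumping the API to v2 later would only mean adding a /v2 prefix loop and repointing the base alias at the newest handlers.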

louisgv (Owner) commented Sep 20, 2023

Actually... it would be much better if we exposed the baseURL as a config instead, so that folks can set /v1 or / themselves. (Similar to how we have a config for the port, etc.)

EDIT: scratch this idea - it's basically #42

@louisgv louisgv changed the title fix: llama-cpp-python compatible server feat: move /completions under /v1/completions Sep 20, 2023
tomasmcm (Contributor, Author)

@louisgv agreed, one step at a time. Having a compatible API is essential for many use cases. Making it customisable is even better but that can be done next 😊
Feel free to merge this (I don't have write access to do it)

@louisgv louisgv merged commit 8b1d9a2 into louisgv:main Sep 20, 2023
2 checks passed
2 participants