Relevant environment info:
- OS: Windows
- Continue: v0.6.18

Description:
Continue appends /v1/ to the LM Studio URL, resulting in this error in the LM Studio logs:

```
[2023-12-28 14:58:31.771] [ERROR] Unexpected endpoint or method. (POST /v1/v1/chat/completions). Returning 200 anyway
```

Standard config:

```json
{
  "title": "Mistral",
  "model": "dolphin-2.6-mistral-7b",
  "contextLength": 4096,
  "provider": "lmstudio"
}
```

Fixed by manually adding an `apiBase` setting in the config:

```json
{
  "title": "Mistral",
  "model": "dolphin-2.6-mistral-7b",
  "contextLength": 4096,
  "provider": "lmstudio",
  "apiBase": "http://localhost:1234"
}
```

The code here hard-codes the `v1`: https://github.com/continuedev/continue/blame/b162e8703a357c8d365cd6073bbc7fbb58ad527f/core/llm/llms/LMStudio.ts#L7

To reproduce:
No response
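The duplicated path segment happens when a default base URL that already ends in `/v1` has `v1/chat/completions` appended to it again. One way to guard against this is to normalize the base before appending; the sketch below is a hypothetical illustration of that idea (`normalizeApiBase` and `chatCompletionsUrl` are made-up helpers, not Continue's actual code):

```typescript
// Sketch of the failure mode and a possible guard. Hypothetical helpers,
// not the real LMStudio.ts implementation.

function normalizeApiBase(apiBase: string): string {
  // Strip a trailing slash, then a trailing "/v1", so the caller can
  // safely append "v1/chat/completions" exactly once.
  return apiBase.replace(/\/$/, "").replace(/\/v1$/, "");
}

function chatCompletionsUrl(apiBase: string): string {
  return `${normalizeApiBase(apiBase)}/v1/chat/completions`;
}

// A base that already contains "/v1" and a bare host now yield the same URL,
// instead of the bad "/v1/v1/chat/completions" seen in the LM Studio logs:
console.log(chatCompletionsUrl("http://localhost:1234/v1"));
// http://localhost:1234/v1/chat/completions
console.log(chatCompletionsUrl("http://localhost:1234"));
// http://localhost:1234/v1/chat/completions
```

This also explains why setting `"apiBase": "http://localhost:1234"` (without `/v1`) works around the bug: the hard-coded `v1` is then only added once.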
Thanks for the report! I fixed it in two ways in this commit, and I will publish a new version later tonight.