
[Bug]: Running LibreChat against LiteLLM backed by Ollama #1270

Closed
netnem opened this issue Dec 3, 2023 · 4 comments · Fixed by #1278

netnem commented Dec 3, 2023

Contact Details

No response

What happened?

I have Ollama/openchat running behind the OpenAI-compatible proxy of LiteLLM.

The chat completion never "finishes" when the bot is responding; the client appears to keep waiting for a final chat completion message.

Steps to Reproduce

1. Install LibreChat via Docker.
2. In the .env file, set OPENAI_REVERSE_PROXY=http://192.168.2.142:8000 (I installed LiteLLM on the Docker host).
3. Install Ollama via the install script: curl https://ollama.ai/install.sh | sh
4. pip install litellm
5. ollama pull openchat
6. pip install async_generator
7. litellm --model ollama/openchat --api_base http://localhost:11434 --drop_params (--drop_params is required because presence_penalty is not supported in LiteLLM/Ollama)

The setup works for calling a local LLM, but the cursor gets "stuck" and never returns control to the user to send additional messages.
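
To see what the LiteLLM proxy actually streams back (and whether an assistant role or a finish_reason ever arrives), a minimal standalone sketch like the one below can help. It is not part of LibreChat; it assumes the openai Node SDK v4 (npm install openai) and the proxy address from the .env above, so adjust the baseURL and model name for your setup.

// check-stream.mjs -- diagnostic sketch only
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'sk-anything',                 // placeholder; adjust if your proxy checks keys
  baseURL: 'http://192.168.2.142:8000',  // LiteLLM proxy from the steps above
});

const stream = await client.chat.completions.create({
  model: 'ollama/openchat',              // the model passed to `litellm --model`
  messages: [{ role: 'user', content: 'Say hello in one sentence.' }],
  stream: true,
});

// Print each delta and log whether a role or finish_reason is ever sent.
for await (const chunk of stream) {
  const choice = chunk.choices[0];
  if (choice?.delta?.role) console.error(`role: ${choice.delta.role}`);
  if (choice?.delta?.content) process.stdout.write(choice.delta.content);
  if (choice?.finish_reason) console.error(`\nfinish_reason: ${choice.finish_reason}`);
}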

What browsers are you seeing the problem on?

Firefox, Microsoft Edge

Relevant log output

There was an uncaught error:
OpenAIError: stream ended without producing a ChatCompletionMessage with role=assistant
    at ChatCompletionStream._AbstractChatCompletionRunner_getFinalMessage (/app/node_modules/openai/lib/AbstractChatCompletionRunner.js:464:11)
    at ChatCompletionStream._AbstractChatCompletionRunner_getFinalContent (/app/node_modules/openai/lib/AbstractChatCompletionRunner.js:455:134)
    at ChatCompletionStream._emitFinal (/app/node_modules/openai/lib/AbstractChatCompletionRunner.js:282:152)
    at /app/node_modules/openai/lib/AbstractChatCompletionRunner.js:77:22
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Clearing sync timeouts before exiting...
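
For context, this error comes from the openai Node SDK's streaming helper (the ChatCompletionStream in the trace above), which assembles a final assistant message from the streamed deltas and throws if the stream ends before one is produced (for example, when a proxy's SSE stream omits the assistant role or closes early). The sketch below is not the fix in #1278, just an illustration of the same helper using the endpoint and model names from the steps above:

// sketch of the failing code path, not the actual LibreChat code
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'sk-anything',                 // placeholder
  baseURL: 'http://192.168.2.142:8000',  // LiteLLM proxy
});

const runner = client.beta.chat.completions.stream({
  model: 'ollama/openchat',
  messages: [{ role: 'user', content: 'Hello' }],
});

runner.on('content', (delta) => process.stdout.write(delta));

try {
  // finalMessage() is where "stream ended without producing a
  // ChatCompletionMessage with role=assistant" surfaces if the proxy's
  // stream never yields a usable assistant message.
  const message = await runner.finalMessage();
  console.log('\nassistant message assembled:', message.content?.length, 'chars');
} catch (err) {
  console.error('\nstream ended early:', err.message);
}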

Screenshots

[screenshot attached]

Code of Conduct

  • I agree to follow this project's Code of Conduct
netnem added the bug label on Dec 3, 2023
@danny-avila (Owner)

Thanks for your thorough report. I hope to address this soon, as I've seen this error in another context and can now reproduce it.

danny-avila self-assigned this on Dec 3, 2023
@danny-avila (Owner)

I'm fixing this right now

@danny-avila (Owner)

FYI, I seem to have issues with Ollama independent of LibreChat when I don't include --drop_params; related issue: BerriAI/litellm#992 (comment)

netnem commented Dec 5, 2023

Confirmed the latest merge works great.

Thanks @danny-avila
