What happened?
I have Ollama/openchat running behind the OpenAI-compatible proxy of LiteLLM.
The chat completion never "finishes" while the bot is responding; the client appears to be waiting for a final chat completion message that never arrives.
Steps to Reproduce
1. Install LibreChat via Docker.
2. In the .env file, set OPENAI_REVERSE_PROXY=http://192.168.2.142:8000 (I installed LiteLLM on the Docker host).
3. Install Ollama via the install script: curl https://ollama.ai/install.sh | sh
4. pip install litellm
5. ollama pull openchat
6. pip install async_generator
7. litellm --model ollama/openchat --api_base http://localhost:11434 --drop_params (--drop_params is required because LiteLLM does not support presence_penalty for Ollama)

The setup works for calling a local LLM, but the cursor gets "stuck" and never returns control to the user, so no further messages can be sent.
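A quick way to see why the cursor hangs is to inspect the raw event stream coming back from the proxy. An OpenAI-style streaming response is terminated by a literal `data: [DONE]` line; if the proxy never sends it, the client keeps waiting. The sketch below is a hypothetical helper (the sample lines are stand-ins for what `curl -N` against the LiteLLM endpoint would print, not captured output from this setup):

```python
# Sketch: decide whether a captured OpenAI-style SSE stream is complete.
# "data: [DONE]" is the terminator an OpenAI-compatible backend must send.

def stream_is_complete(sse_lines):
    """Return True if the stream ends with the OpenAI [DONE] sentinel."""
    data_lines = [l for l in sse_lines if l.startswith("data: ")]
    return bool(data_lines) and data_lines[-1] == "data: [DONE]"

# A healthy stream: content chunks followed by the terminator.
ok = [
    'data: {"choices":[{"delta":{"role":"assistant","content":"Hi"}}]}',
    "data: [DONE]",
]

# A truncated stream, matching the symptom in this issue: no terminator,
# so the client-side cursor never "returns" to the user.
stuck = [
    'data: {"choices":[{"delta":{"content":"Hi"}}]}',
]

print(stream_is_complete(ok))     # True
print(stream_is_complete(stuck))  # False
```

With a live proxy, the same check can be eyeballed by running `curl -N` against the LiteLLM endpoint with `"stream": true` and watching whether `data: [DONE]` ever arrives.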
What browsers are you seeing the problem on?
Firefox, Microsoft Edge
Relevant log output
There was an uncaught error:
OpenAIError: stream ended without producing a ChatCompletionMessage with role=assistant
at ChatCompletionStream._AbstractChatCompletionRunner_getFinalMessage (/app/node_modules/openai/lib/AbstractChatCompletionRunner.js:464:11)
at ChatCompletionStream._AbstractChatCompletionRunner_getFinalContent (/app/node_modules/openai/lib/AbstractChatCompletionRunner.js:455:134)
at ChatCompletionStream._emitFinal (/app/node_modules/openai/lib/AbstractChatCompletionRunner.js:282:152)
at /app/node_modules/openai/lib/AbstractChatCompletionRunner.js:77:22
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Clearing sync timeouts before exiting...
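The stack trace points at the step in the `openai` Node library that assembles the final message once the stream ends: it looks through the accumulated deltas for one whose role is "assistant" and throws if none was ever received. A rough Python equivalent of that check (hypothetical names and simplified merging; the real implementation lives in AbstractChatCompletionRunner.js):

```python
# Rough, simplified Python equivalent of the client-side check that
# produced the error above: once the stream ends, the accumulated
# message must carry role "assistant", or an error is raised.

class OpenAIError(Exception):
    pass

def get_final_message(deltas):
    """Merge streamed deltas; return the assistant message or raise."""
    role, parts = None, []
    for d in deltas:
        role = d.get("role", role)  # the role usually arrives on the first delta
        if d.get("content"):
            parts.append(d["content"])
    if role != "assistant":
        raise OpenAIError(
            "stream ended without producing a ChatCompletionMessage "
            "with role=assistant"
        )
    return {"role": "assistant", "content": "".join(parts)}

# Deltas as an OpenAI-compatible backend should send them:
good = [{"role": "assistant", "content": "Hel"}, {"content": "lo"}]

# Deltas with the role field missing, which would reproduce this error:
bad = [{"content": "Hel"}, {"content": "lo"}]
```

This suggests the proxy's streamed chunks never include `"role": "assistant"` (or the stream ends before a final message can be assembled), which is a proxy/backend formatting issue rather than a browser one.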
Screenshots
Code of Conduct
I agree to follow this project's Code of Conduct