
Ollama streaming output is garbled #1088

Closed
amir-ghasemi opened this issue Jan 10, 2024 · 3 comments

Comments


amir-ghasemi commented Jan 10, 2024

Using local LLMs with Ollama produces output in which words are occasionally cut off or garbled. Using the Ollama server outside of Chatbot-UI does not exhibit this issue.

[Screenshot: with_streaming]

I was able to work around this by modifying the code to disable streaming:

```js
const response = await fetchChatResponse(
  process.env.NEXT_PUBLIC_OLLAMA_URL + "/api/chat",
  {
    model: "mistral", // chatSettings.model
    messages: formattedMessages,
    stream: false,
    options: { temperature: payload.chatSettings.temperature }
  },
  false,
  newAbortController,
  setIsGenerating,
  setChatMessages
)
```

Here is the output without streaming. It would be nice to be able to stream though.

[Screenshot: no_streaming]

@mckaywrigley (Owner) commented:

Interesting... I suspect there's an error with how we're parsing the stream. Going to add this to the top of my bugs list because this is a nasty one for sure.
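One plausible cause of this kind of garbling (a guess, not confirmed from the Chatbot-UI source): Ollama's `/api/chat` streams newline-delimited JSON, one object per line, and network chunks can split a JSON line mid-object. If each chunk is parsed directly instead of being buffered up to the next newline, partial words get dropped or mangled. A minimal sketch of newline-buffered parsing (the `createNdjsonParser` helper and the simulated chunks are hypothetical, for illustration only):

```javascript
// Sketch: buffer incoming chunks and only parse complete NDJSON lines.
// Ollama's streaming /api/chat responses put one JSON object per line,
// each carrying a message.content fragment.
function createNdjsonParser(onContent) {
  let buffer = "";
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the trailing partial line for the next chunk
    for (const line of lines) {
      if (!line.trim()) continue;
      const obj = JSON.parse(line);
      if (obj.message && obj.message.content) onContent(obj.message.content);
    }
  };
}

// Simulated network chunks that split one JSON object across a boundary:
let out = "";
const feed = createNdjsonParser((text) => { out += text; });
feed('{"message":{"content":"Hel"}}\n{"message":{"con');
feed('tent":"lo"}}\n');
console.log(out); // "Hello"
```

Parsing each chunk as-is would throw (or silently drop text) on the split second object; the buffer makes chunk boundaries irrelevant.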

@mgsotelo commented:

@mckaywrigley do you have any updates on this? It is still happening for me.

[Screenshot 2024-01-15 at 13:01:20]

@davidamacey commented:

Hello, I am having the same issue here:

System: Ubuntu 22.04, latest version of Ollama and latest pull of chatbot-ui as of 4 p.m. EDT.

Running the same prompt in the Ollama CLI, I get a different response than in chatbot-ui.

Attached are text files comparing the Python code output from each:

Hopefully this is a simple fix!

chatbot-ui-output.txt
ollama-cli-output.txt
