
Conversation

logicalor

| Q            | A        |
|--------------|----------|
| Bug fix?     | yes      |
| New feature? | yes      |
| Docs?        | yes/no   |
| Issues       | Fix #338 |
| License      | MIT      |

Both a bug fix and a feature: this adds support for streaming output from Ollama and prevents an error from being thrown when streaming is requested as part of a chat request.
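For context on what the bridge has to consume: when `"stream": true` is set, Ollama's `/api/chat` endpoint returns newline-delimited JSON, where each chunk carries a partial `message.content` and the final chunk is marked `"done": true`. The sketch below is not the code from this PR, just a minimal plain-PHP illustration of that protocol; the model name and prompt are placeholders, and it assumes each write callback receives whole JSON lines rather than buffering partial ones.

```php
<?php

// Minimal sketch (not this PR's implementation): stream a chat completion
// from a local Ollama instance and print tokens as they arrive.
$payload = json_encode([
    'model'    => 'llama3.2', // placeholder model
    'messages' => [['role' => 'user', 'content' => 'Write a haiku about rivers.']],
    'stream'   => true,       // Ollama then emits newline-delimited JSON chunks
]);

$ch = curl_init('http://localhost:11434/api/chat');
curl_setopt_array($ch, [
    CURLOPT_POST          => true,
    CURLOPT_POSTFIELDS    => $payload,
    CURLOPT_HTTPHEADER    => ['Content-Type: application/json'],
    // Each callback invocation receives a chunk of the response body; this
    // sketch assumes complete JSON lines per call (a real client should
    // buffer partial lines across calls).
    CURLOPT_WRITEFUNCTION => function ($ch, string $data): int {
        foreach (array_filter(explode("\n", $data)) as $line) {
            $chunk = json_decode($line, true);
            if (isset($chunk['message']['content'])) {
                echo $chunk['message']['content'];
            }
            if (!empty($chunk['done'])) {
                echo PHP_EOL; // final chunk: generation finished
            }
        }

        return strlen($data); // tell curl how many bytes were consumed
    },
]);

curl_exec($ch);
curl_close($ch);
```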

@carsonbot added the Bug, Feature, and Status: Needs Review labels on Aug 21, 2025
@carsonbot changed the title to 338 Support streaming output from Ollama on Aug 21, 2025
@logicalor changed the title from 338 Support streaming output from Ollama to [Platform] Support streaming output from Ollama on Aug 21, 2025
@OskarStark
Contributor

Can you please add an example in the examples/ directory? Thanks

@OskarStark changed the title from [Platform] Support streaming output from Ollama to [Platform][Ollama] Support streaming output on Aug 21, 2025
@OskarStark added the Platform label on Aug 21, 2025
@logicalor
Author

> Can you please add an example in the examples/ directory? Thanks

@OskarStark updated with an example

@OskarStark
Contributor

Thanks

It looks like your committer email is not associated with your GitHub account

@logicalor
Author

> Thanks
>
> It looks like your committer email is not associated with your GitHub account

@OskarStark thanks, hadn't noticed that. Added.

@logicalor requested a review from Guikingone on August 23, 2025 at 05:26
@logicalor requested a review from Guikingone on August 23, 2025 at 10:00
@chr-hertel
Member

Which Ollama version do you use?

@chr-hertel
Member

Thanks @logicalor for reporting and working on this - we need this :)

Made some comments, but I think all in all we're close to merging it 👍

Co-authored-by: Christopher Hertel <mail@christopher-hertel.de>
@logicalor
Author

> Which Ollama version do you use?

@chr-hertel I was running 0.10.1 but just upgraded to 0.11.7.

@logicalor requested a review from chr-hertel on August 28, 2025 at 09:46
Labels
Bug, Feature, Platform, Status: Needs Review
Development

Successfully merging this pull request may close these issues.

[Platform] Support streaming output for Ollama
5 participants