Add streaming prompt support for GPT providers (OpenAI, Gemini, Ollama, Anthropic) #1645
Motivation
Adds $rest(...).get2Stream/post2Stream plumbing and local stream helpers to parse SSE/NDJSON payloads and integrate with the current conversation/tool-calling flow.

Description
- _readSseStream and provider-specific _requestStream functions that call $rest(...).get2Stream/post2Stream with Accept: text/event-stream where appropriate.
- rawPromptStream and promptStream methods on each provider implementation (OpenAI, Gemini, Ollama, Anthropic) to parse streamed payloads, aggregate content and events, surface tool-call deltas, and capture stats.
- ow.ai.gpt.prototype.promptStream, ow.ai.gpt.prototype.rawPromptStream, and the $gpt.promptStream/$gpt.rawPromptStream wrappers.

Testing
Manually exercise the new rawPromptStream/promptStream methods (for example via $gpt({...}).promptStream(...)), or run the repository test suite with cd tests && ojob autoTestAll.yaml, as recommended by the project guidelines.
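As an illustration of the SSE parsing the stream helpers perform, the sketch below accumulates network chunks from a text/event-stream response, splits them into data: lines, and aggregates OpenAI-style content deltas. The function name and the payload shape are assumptions for illustration only, not the actual OpenAF implementation:

```javascript
// Hypothetical sketch of SSE parsing along the lines of _readSseStream.
// The payload shape (choices[0].delta.content) is modeled on the OpenAI
// streaming format; other providers differ.
function parseSseChunks(chunks) {
  var content = "", events = [], buffer = "";
  chunks.forEach(function (chunk) {
    buffer += chunk;
    var lines = buffer.split("\n");
    buffer = lines.pop(); // keep any incomplete trailing line for the next chunk
    lines.forEach(function (line) {
      if (line.indexOf("data:") !== 0) return; // skip comments and blank lines
      var data = line.slice(5).trim();
      if (data === "[DONE]") return; // end-of-stream marker
      var payload = JSON.parse(data);
      events.push(payload);
      var delta = payload.choices && payload.choices[0] && payload.choices[0].delta;
      if (delta && typeof delta.content === "string") content += delta.content;
    });
  });
  return { content: content, events: events };
}

// Example: two network chunks carrying partial content deltas
var out = parseSseChunks([
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n',
  'data: {"choices":[{"delta":{"content":"lo"}}]}\ndata: [DONE]\n'
]);
console.log(out.content); // "Hello"
```

NDJSON providers such as Ollama differ mainly in that each complete line is a bare JSON object rather than a data:-prefixed SSE event, but the same buffer-and-split-on-newline approach applies.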