feat: Assistant stream mode and event callback #731
base: master
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff             @@
##           master     #731      +/-   ##
==========================================
+ Coverage   98.46%   98.75%   +0.29%
==========================================
  Files          24       25       +1
  Lines        1364     1205     -159
==========================================
- Hits         1343     1190     -153
+ Misses         15        9       -6
  Partials        6        6

View full report in Codecov by Sentry.
Thank you for the PR! Sorry for the long review process :D
stream.event = "message" // default event for chat completion stream
ctx, cancel := context.WithCancel(context.Background())
stream.handlerCtx = ctx
go func() {
Sorry, I don't think we're going to merge this, as starting goroutines doesn't fit into the design of this library.
Please refer to the following comments:
I'm happy to have this in the examples/ folder, though!
No problem. What if I revise the parts where I used goroutines? Would that work better with the design of the library?
@CallOrRet I'm totally open to that!
Co-authored-by: Alexander Baranov <677093+sashabaranov@users.noreply.github.com>
There is also some commented-out code in this PR. But the bigger concern is the incomplete response model.
type AssistantThreadRunStreamResponse struct {
	ID     string       `json:"id"`
	Object string       `json:"object"`
	Delta  MessageDelta `json:"delta,omitempty"`
}
This is a very trimmed down version of the response model.
The complete response has many more parameters, which would be important for running function calling etc. https://platform.openai.com/docs/api-reference/runs/createRun
stream = &AssistantThreadRunStream{
	streamReader: resp,
}
return
Suggested change:
-	return
+	return stream, nil
	withBetaAssistantVersion(c.config.AssistantVersion),
)
if err != nil {
	return
Suggested change:
-	return
+	return nil, err
stream = &AssistantThreadRunStream{
	streamReader: resp,
}
return
Suggested change:
-	return
+	return stream, nil
@@ -137,6 +138,7 @@ type RunList struct {

type SubmitToolOutputsRequest struct {
	ToolOutputs []ToolOutput `json:"tool_outputs"`
	Stream      bool
Suggested change:
-	Stream bool
+	Stream bool `json:"stream,omitempty"`
@@ -88,6 +88,7 @@ type RunRequest struct {
	AdditionalInstructions string         `json:"additional_instructions,omitempty"`
	Tools                  []Tool         `json:"tools,omitempty"`
	Metadata               map[string]any `json:"metadata,omitempty"`
	Stream                 bool
Suggested change:
-	Stream bool
+	Stream bool `json:"stream,omitempty"`
This update adds support for Assistant stream mode and stream event callbacks without introducing any breaking changes or modifying the existing API endpoints.
The implemented features are in line with the OpenAI Assistant streaming API, as documented here: OpenAI Assistant Streaming API