Issues with Streaming #4
Because in this case it does not respond with a single JSON object, but in its own special format: a stream of server-sent events.
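The example that was attached here did not survive the page capture; for illustration, OpenAI's streaming responses arrive as server-sent events of roughly this shape (payload values below are made up):

```text
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hel"}}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"lo"}}]}

data: [DONE]
```

Each `data:` line is one event carrying a small delta of the answer, and the stream is terminated by the literal `[DONE]` marker rather than by a closing JSON object, which is why a plain JSON parser rejects it.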
You can choose not to use this mode at all and simply output the data yourself, splitting it up manually.
Thank you for your response. In my sample code above, when I turn on streaming (by uncommenting the line), the OpenAI.Chat.Create call fails after a few moments. The request does get sent to the chat endpoint, but the response apparently is not in the expected format, so I never reach the code inside the try block that parses it.
The event is fired multiple times as data arrives in the stream. Each line (--------) is a new event in the stream response.
When I try to use OpenAI.Completion.CreateStream, I get an error that CreateStream is an undeclared identifier. For my uses clause, I am using: Do I need to add something else? I am using Delphi 11.1.
I haven't released it yet.
Pushed a new method for working in stream mode. Example:
OpenAI.Chat.CreateStream(
procedure(Params: TChatParams)
begin
Params.Messages([TChatMessageBuild.User(Buf.Text)]);
Params.MaxTokens(1024);
Params.Stream;
end,
procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
begin
if (not IsDone) and Assigned(Chat) then
Writeln(Chat.Choices[0].Delta.Content)
else if IsDone then
Writeln('DONE!');
Writeln('-------');
Sleep(100);
end);
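A minimal sketch of reassembling the streamed deltas into one full answer, assuming the CreateStream callback shape shown above (the TStringBuilder accumulator and the prompt text are introduced here for illustration, not part of the library):

OpenAI.Chat.CreateStream(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.User('Hello')]);
    Params.Stream;
  end,
  procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
  begin
    // Each callback delivers one delta chunk; append it to rebuild the answer.
    // FullText is assumed to be a TStringBuilder created by the caller.
    if (not IsDone) and Assigned(Chat) then
      FullText.Append(Chat.Choices[0].Delta.Content)
    else if IsDone then
      Writeln(FullText.ToString); // the complete response, same as a non-streamed call
  end);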
But for the same effect of printing the response out sequentially, I would still use a regular request that returns the full answer and output it word by word manually. That is much easier to do than streaming.
Thank you so much. This is incredible. The reason I want streaming is that, for complex answers, ChatGPT can take over 60 seconds. Within my app, the user would submit a request and see nothing happen for 60 seconds. If I stream the answer, as the ChatGPT website does, the user can start reading within a few seconds, as soon as ChatGPT starts generating it.
Thank you so much for this incredible library. I am trying to use it in a console-based streaming example. I can create a chat and get all the data back in one return message. However, when I try to use streaming, I get an error. The following console code works fine: I submit my chat and get the entire answer back in one "event". I would like the same behavior as the ChatGPT website, where the tokens are displayed as they are generated. My code is as follows:
var Buf: TStringList;
begin
...
var Chat := OpenAI.Chat.Create(
procedure(Params: TChatParams)
begin
Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Buf.Text)]);
Params.MaxTokens(1024);
// Params.Stream(True);
end);
try
  for var Choice in Chat.Choices do
  begin
    Writeln(Choice.Message.Content);
  end;
finally
  Chat.Free;
end;
This code works. When I try to turn on streaming, I get the exception EConversionError 'The input value is not a valid Object', which causes ChatGPT to return 'Empty or Invalid Response'. Any ideas appreciated.