
Issues with Streaming #4

Closed
gspears00 opened this issue Mar 22, 2023 · 11 comments
Assignees
Labels
bug Something isn't working

Comments

@gspears00

Thank you so much for this incredible library. I am trying to use it in a console-based streaming example. I can create a chat and get the entire answer back in a single return message. However, when I try to use streaming, I get an error. The following console code works fine: I submit my chat and the whole answer comes back in one "event". I would like the same behavior as the ChatGPT website, where tokens are displayed as they are generated. My code is as follows...

var Buf: TStringList;
begin
  ...
  var Chat := OpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages([TChatMessageBuild.Create(TMessageRole.User, Buf.Text)]);
      Params.MaxTokens(1024);
      // Params.Stream(True);
    end);
  try
    for var Choice in Chat.Choices do
    begin
      Buf.Add(Choice.Message.Content);
      Writeln(Choice.Message.Content);
    end;
  finally
    Chat.Free;
  end;

This code works. When I turn on streaming (by uncommenting the Params.Stream line), I get an EConversionError ('The input value is not a valid Object'), which results in an 'Empty or Invalid Response'. Any ideas appreciated.

@HemulGM
Owner

HemulGM commented Mar 22, 2023

Because in streaming mode it responds not with a single JSON object, but in its own special format: a sequence of "data:" lines, each carrying one JSON chunk.

Example

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": "\r", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": "\n", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": "1", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 2", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 3", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 4", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 5", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 6", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 7", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 8", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 9", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": ",", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: {"id": "cmpl-6wsVxtkU0TZrRAm4xPf5iTxyw9CTf", "object": "text_completion", "created": 1679490597, "choices": [{"text": " 10", "index": 0, "logprobs": null, "finish_reason": null}], "model": "text-davinci-003"}

data: [DONE]
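A minimal sketch of how a chunk in that format could be parsed by hand (illustrative only; ParseChunk, its name, and the whole-line assumption are mine, not part of the library — a real parser must buffer partial lines across chunks):

```pascal
uses
  System.SysUtils;

// Split one received chunk into its "data:" payloads.
procedure ParseChunk(const Chunk: string);
var
  Line, Payload: string;
begin
  for Line in Chunk.Split([#13#10, #10]) do
  begin
    if not Line.StartsWith('data: ') then
      Continue;                      // skip blank/other lines
    Payload := Line.Substring(6);    // text after 'data: '
    if Payload = '[DONE]' then
      Exit;                          // end-of-stream marker
    // Payload is now a single JSON object; it could be parsed with
    // TJSONObject.ParseJSONValue and the token text extracted from
    // choices[0].
    Writeln(Payload);
  end;
end;
```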

@HemulGM
Owner

HemulGM commented Mar 22, 2023

You can choose not to use this mode at all and simply output the data yourself, splitting the full response up manually.

@gspears00
Author

Thank you for your response. In my sample code above, when I turn on streaming (by uncommenting the line), the OpenAI.Chat.Create call fails after a few moments. The request reaches the API, but the response apparently is not in the format the library expects. I never get to the code within the try block in order to parse the response.

@HemulGM
Owner

HemulGM commented Mar 22, 2023

OpenAI.Completion.CreateStream(
  procedure(Params: TCompletionParams)
  begin
    Params.Prompt(Buf.Text);
    Params.MaxTokens(1024);
    Params.Stream;
  end,
  procedure(Response: TStringStream)
  begin
    Writeln(Response.DataString);
    Writeln('-------');
    Sleep(100);
  end);

I experimentally did this:

[screenshot: the raw stream chunks printed to the console, separated by ------- lines]

@HemulGM
Owner

HemulGM commented Mar 22, 2023

The event is fired multiple times as data arrives in the stream. Each block between the ------- separators is a new event in the stream response.

@gspears00
Author

gspears00 commented Mar 22, 2023

When I try to use OpenAI.Completion.CreateStream, I get an error that CreateStream is an undeclared identifier. My uses clause is:
System.SysUtils,
System.Classes,
OpenAI.Completions,
OpenAI.Chat,
OpenAI;

Do I need to add something else? I am using Delphi 11.1

@HemulGM
Owner

HemulGM commented Mar 22, 2023

I haven't released it yet.

@HemulGM HemulGM self-assigned this Mar 22, 2023
@HemulGM HemulGM added the bug Something isn't working label Mar 22, 2023
@HemulGM
Owner

HemulGM commented Mar 22, 2023

Pushed a new method for working in stream mode.

@HemulGM HemulGM closed this as completed Mar 22, 2023
@HemulGM
Owner

HemulGM commented Mar 22, 2023

Example

OpenAI.Chat.CreateStream(
  procedure(Params: TChatParams)
  begin
    Params.Messages([TChatMessageBuild.User(Buf.Text)]);
    Params.MaxTokens(1024);
    Params.Stream;
  end,
  procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
  begin
    if (not IsDone) and Assigned(Chat) then
      Writeln(Chat.Choices[0].Delta.Content)
    else if IsDone then
      Writeln('DONE!');
    Writeln('-------');
    Sleep(100);
  end);

@HemulGM
Owner

HemulGM commented Mar 22, 2023

But for the same effect of writing the response out sequentially, I would still use a regular request that returns the full answer, and then output it word by word manually. That is much easier than streaming.
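A minimal sketch of that alternative (assuming Chat holds a completed, non-streaming response; the 50 ms delay is an arbitrary choice):

```pascal
uses
  System.SysUtils, System.Classes;

// Print the full answer word by word to imitate typing.
procedure TypeOut(const Answer: string);
var
  Part: string;
begin
  for Part in Answer.Split([' ']) do
  begin
    Write(Part, ' ');
    TThread.Sleep(50); // small delay between words
  end;
  Writeln;
end;

// Usage, given a completed non-streaming response:
//   TypeOut(Chat.Choices[0].Message.Content);
```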

@gspears00
Author

Thank you so much. This is incredible. The reason I want streaming is that for complex answers, ChatGPT can take over 60 seconds. In my app, the user would submit a request and see nothing happen for 60 seconds. If I stream the answer, as the ChatGPT website does, the user can start reading within a few seconds, as soon as ChatGPT begins generating it.
